Hello. I have a scripting query that I'm stumped on and hope you can help with.
Basically, I have a ksh script that calls a process to create n binary files, each with a maximum size of 1 GB. The process can write several files at once (parallel operation), based on the parallelisation parameter fed into the script at the start. Normally we would wait for this process to complete and then gzip all the files individually (gzip *.dmp, for example). However, on some systems we don't have enough disk space to wait until all the 1 GB files have been produced.
I have previously written some code to gzip the files in parallel (see below). However, I now need to gzip them in parallel whilst the first process is still running. I need to be careful not to attempt to gzip any files currently being written (up to n at a time from the parallel command), so some sort of looping will be required. I also want to keep the option of parallel gzip if possible.
Code:
...
gzip_func() {
    started=0
    threads=4
    for filename in ${EXP_DIR}/*.dmp
    do
        # when a batch of $threads gzips is already running, wait for it
        # to finish first, so the current file is never skipped
        if [[ ${started} -ge ${threads} ]]; then
            print "wait ${list_of_pids}"
            wait ${list_of_pids}
            list_of_pids=""
            started=0
        fi
        let started=started+1
        print "gzip ${filename}"
        ( ${GZIPCMD} ${filename} ) &
        list_of_pids="${list_of_pids} $!"
    done
}
...
my_binary_file_creation_process
...
while [ $(find ${EXP_DIR} -name '*.dmp' | wc -l) -gt 0 ]; do
gzip_func
print "wait ${list_of_pids}"
wait ${list_of_pids}
list_of_pids=""
done
Can anyone help me write some code for this using standard Solaris 8/9/10 tools and the Korn shell? Perl commands should also be possible (version 5.6.1 is installed).
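For what it's worth, here is a rough sketch of the direction I'm thinking of, assuming fuser(1) can tell me whether a file is still open: fuser prints the PIDs of processes using a file to stdout (the file name goes to stderr), so empty output means the writer has closed the file and it is safe to compress. EXP_DIR and GZIPCMD are as in my script above; the writer-PID check in the commented outer loop is hypothetical.

```shell
# Sketch: compress only the .dmp files that no process currently holds
# open, in parallel batches of $threads.

threads=4
GZIPCMD=${GZIPCMD:-gzip}

gzip_completed() {
    started=0
    pids=""
    for filename in ${EXP_DIR}/*.dmp; do
        [ -f "${filename}" ] || continue            # glob matched nothing
        if [ -n "$(fuser "${filename}" 2>/dev/null)" ]; then
            continue                                # still being written
        fi
        ${GZIPCMD} "${filename}" &
        pids="${pids} $!"
        started=$((started + 1))
        if [ "${started}" -ge "${threads}" ]; then
            wait ${pids}                            # batch full: drain it
            pids=""
            started=0
        fi
    done
    [ -n "${pids}" ] && wait ${pids}                # drain the last batch
    return 0
}

# Hypothetical outer loop: start the writer in the background, then keep
# sweeping the directory until it has exited and no .dmp files remain.
#
#   my_binary_file_creation_process &
#   writer_pid=$!
#   while kill -0 ${writer_pid} 2>/dev/null || \
#         [ -n "$(ls ${EXP_DIR}/*.dmp 2>/dev/null)" ]; do
#       gzip_completed
#       sleep 30
#   done
```

Skipped files are simply picked up on a later sweep, so a file being written by the parallel export is never touched until its writer closes it.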
gzip(3) - User Contributed Perl Documentation
NAME
PerlIO::gzip - Perl extension to provide a PerlIO layer to gzip/gunzip
SYNOPSIS
use PerlIO::gzip;
open FOO, "<:gzip", "file.gz" or die $!;
print while <FOO>; # And it will be uncompressed...
binmode FOO, ":gzip(none)" # Starts reading deflate stream from here on
DESCRIPTION
PerlIO::gzip provides a PerlIO layer that manipulates files in the format used by the "gzip" program. Compression and decompression are
implemented, but not together: if you attempt to open a file for both reading and writing, the open will fail.
EXPORT
PerlIO::gzip exports no subroutines or symbols, just a perl layer "gzip"
LAYER ARGUMENTS
The "gzip" layer takes a comma-separated list of arguments. Four mutually exclusive options choose the header-checking mode:
gzip
The default. Expects a standard gzip file header for reading, writes a standard gzip file header.
none
Expects or writes no file header; assumes the file handle is immediately a deflate stream (eg as would be found inside a "zip" file)
auto
Potentially dangerous. If the first two bytes match the "gzip" header "\x1f\x8b" then a gzip header is assumed (and checked); else a
deflate stream is assumed. No different from gzip on writing.
autopop
Potentially dangerous. If the first two bytes match the "gzip" header "\x1f\x8b" then a gzip header is assumed (and checked); else the
layer is silently popped. This results in gzip files being transparently decompressed, while other files are treated normally. Of course,
this has side effects, such as File::Copy becoming gunzip, and File::Compare comparing the uncompressed contents of files.
In autopop mode, opening a handle for writing (or reading and writing) will cause the gzip layer to be popped automatically.
Optionally you can add this flag:
lazy
For reading, defer header checking until the first read. For writing, don't write a header until the first buffer of compressed
data is flushed to disk (and don't write anything at all if no data was written to the handle).
By default, gzip header checking is done before the "open" (or "binmode") returns, so if an error is detected in the gzip header the
"open" or "binmode" will fail. However, this will require reading some data, or writing a header. With lazy set on a file opened for
reading the check is deferred until the first read so the "open" should always succeed, but any problems with the header will cause an
error on read.
open FOO, "<:gzip(lazy)", "file.gz" or die $!; # Dangerous.
while (<FOO>) {
print;
} # Whoa. Bad. You're not distinguishing between errors and EOF.
If you're not careful you won't spot the errors - like the example above you'll think you got end of file.
lazy is ignored if you are in autopop mode.
AUTHOR
Nicholas Clark, <nwc10+perlio-gzip@colon.colondot.net>
SEE ALSO
perl, gzip, rfc 1952 <http://www.ietf.org/rfc/rfc1952.txt> (the gzip file format specification), rfc 1951
<http://www.ietf.org/rfc/rfc1951.txt> (DEFLATE compressed data format specification)
perl v5.18.2 2006-10-01 gzip(3)