compress files as they are created by a dbexport


 
# 1
Old 09-03-2011

hi,

We work on an IBM machine with an Informix database, and we need to run a dbexport, which creates one flat file per table.

We are on AIX 5.3, using ksh.

We need to run the dbexport and the compression (gzip) at the same time.

I wrote a shell script, but I'm sure there is a better way to do this:

Code:
#!/usr/bin/ksh

DIR=/dbex_dir_path/dbexport_dir   # directory dbexport writes into
FLAG=/my_pgm_path/finished        # "0" while dbexport runs, "1" when done

comp()
{
    # wait until the dbexport directory exists and holds at least one file
    while [ ! -d "$DIR" ] || [ -z "$(ls -A "$DIR" 2>/dev/null)" ]
    do
        sleep 3
    done

    # compress files while the dbexport is still running
    while [ "$(cat "$FLAG")" != 1 ]
    do
        # oldest and newest non-empty, not-yet-compressed files;
        # if they differ, the oldest is no longer being written to,
        # so it is safe to compress (the newest is probably the file
        # dbexport is currently writing)
        oldest=$(ls -lt "$DIR" | awk '$5!=0{print $9}' | grep -v '\.gz$' | tail -1)
        newest=$(ls -ltr "$DIR" | awk '$5!=0{print $9}' | grep -v '\.gz$' | tail -1)

        if [ -n "$oldest" ] && [ "$oldest" != "$newest" ]
        then
            gzip "$DIR/$oldest"
        else
            sleep 3
        fi
    done

    # the dbexport is over, so compress all the remaining files
    for fich in $(ls -l "$DIR" | awk '$5!=0{print $9}' | grep -v '\.gz$')
    do
        gzip "$DIR/$fich"
    done
    echo "compression finished"
}

# I couldn't make a global variable survive across the background job,
# so a flag file records whether the dbexport has finished yet
echo "0" > "$FLAG"

# start from an empty export directory
[ ! -d "$DIR" ] || rm -rf "$DIR"

# launch the compress loop in the background
comp &

# dbexport
<dbexport-process>

# dbexport is finished, so flip the flag from "0" to "1"
echo "1" > "$FLAG"

If possible, I would also like to run 4 gzip processes in parallel, but I can't figure out how to do it.
# 2  
Old 09-05-2011
There is more relative value in tools that import than in those that export. If you select delimited text output, you might find it easy to compress on the fly, reducing latency and end-to-end time. I am not sure whether dbexport will write to a named pipe with a gzip on the other side. Maybe use ksh/bash process substitution, '>(...)', to make the named pipes on the fly. There is no sense in writing the uncompressed data to disk!
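The pipe idea can be sketched as below; this is only a sketch, since whether dbexport will accept a FIFO as its output file is exactly what is in doubt here. The path /tmp/my_table.unl and the printf producer are stand-ins for the real export:

```shell
#!/usr/bin/ksh
# Sketch only: the printf below stands in for dbexport, which may or
# may not be willing to write into a FIFO.

PIPE=/tmp/my_table.unl            # the path the exporter would write to
rm -f "$PIPE"
mkfifo "$PIPE"

# reader side: gzip drains the pipe as rows arrive, so the
# uncompressed data never touches the disk
gzip -c < "$PIPE" > /tmp/my_table.unl.gz &

# producer side: stand-in for dbexport writing delimited rows
printf 'row1|row2|row3\n' > "$PIPE"

wait                              # let gzip see EOF and finish
rm -f "$PIPE"
```

In ksh93 or bash, process substitution can set up the same kind of pipe without an explicit mkfifo, if the exporter can be pointed at the generated pipe name.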

You could use 'parallel' to get N parallel runs. A more primitive shell trick is to have one process write table names into a pipe feeding a group in which four background subshells each read names one at a time, using 'line', and export/gzip them. Send the names in biggest first: the biggest file may take more time than all the others combined!
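The four-reader trick might look like this. A sketch only: the /tmp paths and file names are invented placeholders standing in for the export, 'read' is used where ksh's 'line' would also do, and one "STOP" sentinel per worker is written at the end so no reader is left blocking on the pipe:

```shell
#!/usr/bin/ksh
# Sketch with placeholder data; the paths and file names are invented.

DIR=/tmp/demo_export
FIFO=/tmp/table_names.fifo

# stand-ins for the flat files dbexport would create
rm -rf "$DIR"; mkdir -p "$DIR"
for t in big mid small tiny; do echo "data-$t" > "$DIR/$t.unl"; done

rm -f "$FIFO"; mkfifo "$FIFO"

# four background workers each pull one name at a time from the shared
# pipe, so whichever worker is free takes the next (biggest) file;
# the shell reads pipes byte-at-a-time, so lines are not split
for i in 1 2 3 4
do
    while read -r f
    do
        [ "$f" = STOP ] && break
        gzip "$f"
    done < "$FIFO" &
done

# hold the pipe open for writing so a late-starting worker never
# blocks in its open() after the name list has been sent
exec 3> "$FIFO"

# biggest first (ls -S is GNU/BSD; on AIX, sort ls -l output by size)
ls -S "$DIR"/*.unl >&3

# one STOP per worker tells each reader it can exit
for i in 1 2 3 4; do echo STOP >&3; done

exec 3>&-
wait
rm -f "$FIFO"
```

The same dispatch scheme works unchanged if each worker runs a per-table dbexport followed by gzip instead of gzip alone.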