1. I don't know what the command DB_DATA does. Is that important?
2. I don't see why you count the items to test whether you need to do the work. If the second find command does not select anything, the -exec section simply does not run.
3. If the directory/filename paths get very long, your commands may fail. It might be better to change into $DB_DATA_PATH, check that you have done so, and then run find with a directory of .
4. I'm not sure you need the single quotes around the '{}' or ';'
5. When you say it's 'not working', in what way is it not working? Do you have any output/errors?
6. For a dummy run, use .... -exec echo gzip ...... so you make sure you don't break anything whilst working on it.
7. In your first find, you have the options -maxdepth 1 -mtime -1 set when you are counting, but not when you execute the second find. Which is the correct configuration?
I'm sure we can help you work this through to achieve what you want.
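A minimal sketch pulling points 2, 3, and 6 together (the directory and file below are scratch demo values, not your real backup path, and the leading echo keeps it a dry run):

```shell
#!/bin/sh
# Demo setup: a scratch directory with one fresh file so the dry run
# has something to select (replace with your real $DB_DATA_PATH).
DB_DATA_PATH=/tmp/db_data_demo
mkdir -p "$DB_DATA_PATH"
touch "$DB_DATA_PATH/backup1.dat"

# Change into the directory first so find can use short relative paths,
# and stop immediately if the cd fails.
cd "$DB_DATA_PATH" || exit 1

# One find does both the selection and the work: if nothing matches,
# -exec simply never runs, so no separate counting pass is needed.
# The leading "echo" makes this a dry run; remove it for the real thing.
find . -maxdepth 1 -mtime -1 -type f -exec echo gzip {} \;
```

Once the printed gzip commands look right, drop the echo and the same one-liner does the actual compression.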
On top of what rbatte1 said (esp. list item 5!), you're talking of "compress ... into its own tar" but not using the tar command anywhere in your code?
Kind regards,
Robin
Hi Robin,
Many thanks for your reply.
DB_DATA_PATH is the variable for the sub-directory that is created to store the backup files of an application database backup.
So the actual backup path is /db/shr/bcksrv/bck/IPAddress/DBX/Foldera.
One of its sub-directories is
/db/shr/bcksrv/bck/IPAddress/DBX/Foldera/bck_F120170527.
The only entries in the main directory are backup folders like this one. Each backup folder contains two sub-directories, ept and ine, which hold the backup files of the application database.
I compress these manually with the following command:
zip -r bck_F120170527.zip /db/shr/bcksrv/bck/IPAddress/DBX/Foldera/bck_F120170527/ept /db/shr/bcksrv/bck/IPAddress/DBX/Foldera/bck_F120170527/ine
but now I want to run this compression automatically every 2 days.
I am using new modified code for this, but without success: the new sh command is still not working. Any help repairing the code is really appreciated. Thank you.
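A hedged sketch of how that manual zip could be automated (the /tmp scratch tree and the dry-run echo are assumptions for illustration; point BCK_PATH at the real Foldera directory and drop the echo for live use):

```shell
#!/bin/sh
# Hypothetical sketch: create one zip per bck_* backup folder.
# For this demo we build a scratch tree and only echo the zip commands.
BCK_PATH=/tmp/bck_demo/Foldera
mkdir -p "$BCK_PATH/bck_F120170527/ept" "$BCK_PATH/bck_F120170527/ine"

for dir in "$BCK_PATH"/bck_*; do
    [ -d "$dir" ] || continue          # skip if the glob matched nothing
    [ -f "$dir.zip" ] && continue      # skip folders already archived
    echo zip -r "$dir.zip" "$dir/ept" "$dir/ine"
done
```

To run it every 2 days you could install it via crontab -e with an entry such as `0 0 */2 * * /path/to/compress_backups.sh` (the script path is an assumption; also note that `*/2` in the day-of-month field fires on odd days of each month, so the interval is only approximately every 2 days across month boundaries).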
---------- Post updated at 12:36 PM ---------- Previous update was at 12:18 PM ----------
Quote:
Originally Posted by RudiC
On top of what rbatte1 said (esp. list item 5!), you're talking of "compress ... into its own tar" but not using the tar command anywhere in your code?
Thanks for your good info. I have repaired the code now. Pls help to check my code.
...
The new sh command is still not working.
...
Please become accustomed to providing decent context information about your problem.
You've been asked to explain WHAT is not working and HOW it is not working, and to show any system (error) messages verbatim, if they exist, to avoid ambiguity and keep people from guessing.
EDIT: You seem to supply two -f options to tar, and no file(s) to archive.
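For reference, a minimal working tar invocation uses a single -f followed by exactly one archive name, with the files to store after it (the /tmp paths below are scratch demo values, not the poster's directories):

```shell
#!/bin/sh
# Demo setup: two small directories to archive.
mkdir -p /tmp/tar_demo/ept /tmp/tar_demo/ine
echo data1 > /tmp/tar_demo/ept/a.dat
echo data2 > /tmp/tar_demo/ine/b.dat

# One -f, one archive name, then the files/directories to archive.
# -z gzips the result in the same step; -C sets the working directory
# so the archive stores relative paths.
tar -czf /tmp/tar_demo/backup.tar.gz -C /tmp/tar_demo ept ine

# List the archive contents to verify it was written correctly.
tar -tzf /tmp/tar_demo/backup.tar.gz
```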
Yes, sure. I actually want to compress every file in a directory into its own tar, preserving each file's name, automatically via a cron job.
To do this, I am using the sh code that I have pasted here, but the code is not working. I am sure the code is wrong.
So, to answer your questions: WHAT is not working: the sh code that I have pasted here.
HOW it is not working: the code is wrong.
If the code is not working, I need new sh code from any of you here. If it is different from mine, that is okay, as long as it works.
If that is not possible, please help repair my sh code. I really appreciate it. Tqvm
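One way to read that requirement, each regular file getting its own gzipped tar named after itself, can be sketched like this (the /tmp scratch directory is an assumption for the demo; this is not the poster's original script):

```shell
#!/bin/sh
# Hypothetical sketch: give every regular file in a directory its own
# gzipped tar named after the file.  Point SRC_DIR at the real directory
# for actual use; here it is demo scratch space.
SRC_DIR=/tmp/pertar_demo
mkdir -p "$SRC_DIR"
echo one > "$SRC_DIR/file1.dat"
echo two > "$SRC_DIR/file2.dat"

cd "$SRC_DIR" || exit 1
for f in *; do
    [ -f "$f" ] || continue                   # only regular files
    case "$f" in *.tar.gz) continue ;; esac   # don't re-archive archives
    tar -czf "$f.tar.gz" "$f"                 # file1.dat -> file1.dat.tar.gz
done
```

Scheduled from cron (e.g. `0 2 */2 * * /path/to/per_file_tar.sh`, script path hypothetical), this would run unattended every second day of the month.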
Rgds,
Steven
Last edited by Steven_2975; 06-30-2017 at 09:56 PM..