Find all log files under all file systems older than 2 days and zip them


 
# 1  
Old 05-22-2013

Hi All,

Problem Statement: Find all log files under all file systems older than 2 days and zip them. Find all zip files older than 3 days and remove them. Also, this has to be set up under cron.

I have a concern here:
Code:
find . -mtime +2 -iname "*.log" -exec gzip {} \;

I am not sure this will work, since the files will have different permissions. Does it need to be run at root level for it to work on all users' files on the server?

Please help.

# 2  
Old 05-23-2013
Quote:
Does it need to be run at root level for it to work on all users' files on the server?
I am tempted to say no, but I don't know your system... If someone has removed read permission for "others" on such files and you do not belong to the file's group, then they cannot be read... and so the answer would be that you must be root...
Now, who would do such a thing to log files? If you do (yes, I have to in some cases...), then you also set the group ownership to the group of people who are allowed to read the logs... and so you don't need to be root, only a member of the right group...
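
To illustrate that idea, here is a minimal sketch. The group name "logreaders" and the log path are placeholders made up for the example, not something from this thread:
Code:
# hypothetical example: give members of the group "logreaders" read access
# to the application's logs (group name and path are placeholders)
chgrp logreaders /path/to/app/logs/*.log
chmod g+r        /path/to/app/logs/*.log
# note: gzip replaces each .log with a .log.gz, so the user running the job
# also needs write permission on the directory that holds the logs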
# 3  
Old 05-24-2013
With the removal included:
Code:
find . -type f \( -mtime +3 -iname "*.log.gz" -exec echo rm -f {} \; -o -mtime +2 -iname "*.log" -exec echo gzip {} \; \)

Remove the echo once you are satisfied it does what you want.
Without root rights it needs to be able to read each existing file and to write the new file into that file's directory; the new .gz will be owned by the user who runs the command. If it fails on a file, it prints a message to stderr and continues.
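
Since the original requirement was to run this from cron, a crontab entry along these lines might work once the echo is removed. The 02:00 schedule, the /var/log starting point and the /tmp output file are assumptions for the example, not something from this thread:
Code:
# run daily at 02:00 as a user (or root) that can read and replace the logs:
# gzip *.log older than 2 days, delete *.log.gz older than 3 days
0 2 * * * find /var/log -type f \( -mtime +3 -iname "*.log.gz" -exec rm -f {} \; -o -mtime +2 -iname "*.log" -exec gzip {} \; \) >>/tmp/log_cleanup.log 2>&1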
# 4  
Old 05-24-2013

Thanks all,

I got it set up at root level and it works fine for me.
The situation was a bit tricky, as this is a system where many users log in and create log files, and they all belong to different groups, so narrowing it down to one single group was a pain. The suggestion was helpful all the same, as it made me understand what I definitely cannot do in my current scenario.

Thanks a lot for the help, guys.
# 5  
Old 08-29-2013
But what about already existing *.log.gz files?

If some older <something>.log.gz files are already lying around in these dirs, the FIND command will pass those to GZIP as well; GZIP will then try to re-compress them and will ask, for each and every one of those files, whether I want to overwrite the existing <something>.log.gz file.

I tried FIND with -regex and a ".*\.log\b" word boundary to exclude the existing *.log.gz files, but it did not work. Not sure why.

Unfortunately GZIP also does not have a --exclude parameter.

Does anyone have an idea how to do this using FIND and GZIP?
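
Not an answer from this thread, but one way to sidestep the prompt is to let find hand each candidate to a small shell snippet that skips any .log file which already has a .gz sitting next to it. A sketch of that idea:
Code:
# compress *.log files older than 2 days, but skip any file that already
# has a corresponding .gz, so gzip is never asked to overwrite anything
find . -type f -iname "*.log" -mtime +2 -exec sh -c '
  for f in "$@"; do
    [ -e "$f.gz" ] || gzip "$f"
  done
' sh {} +

Alternatively, gzip -f would overwrite existing archives without asking, if losing the old .gz files is acceptable.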
 