Shell Programming and Scripting: Need script to tar files older than 30 days
Post 302374782 by jim mcnamara, 11-25-2009, 12:17 PM
Code:
#!/bin/ksh
# Build a reference timestamp for the first day of last month, then
# tar up every stats*.txt log that is not newer than that timestamp.

# mons[current_month] gives the previous month's number; index 0 is a placeholder
set -A mons nothing 12 01 02 03 04 05 06 07 08 09 10 11

last_year=$(date +"%Y")
typeset -i month=$(date +"%m")
last_month=${mons[month]}

# If last month was December, the reference year is the previous one
if [[ "$last_month" = "12" ]] ; then
     last_year=$(( last_year - 1 ))
fi

# Dummy file stamped 00:00 on the 1st of last month (touch -t CCYYMMDDhhmm)
touch -t ${last_year}${last_month}010000 dummy

# Archive every log file that is not newer than the dummy reference file
tar -cvf tarball.tar $( find /path/to/logs -name 'stats*.txt' ! -newer dummy )

try that
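
If "older than 30 days" is meant literally rather than "before the 1st of last month", find's -mtime test can do the date arithmetic for you. This is only a sketch under the same assumptions as above (placeholder path, pattern, and archive name, and log file names without embedded spaces):

Code:
#!/bin/ksh
# Sketch: archive stats*.txt files last modified more than 30 days ago.
# -mtime +30 means "more than 30 complete 24-hour periods old".
find /path/to/logs -name 'stats*.txt' -mtime +30 > /tmp/old_logs.$$

# Only run tar if find actually matched something
if [[ -s /tmp/old_logs.$$ ]] ; then
     # Like the command substitution above, this relies on word splitting,
     # so it assumes the file names contain no whitespace.
     tar -cvf tarball.tar $( cat /tmp/old_logs.$$ )
fi

rm -f /tmp/old_logs.$$

Either way works; the touch / ! -newer version gives an exact calendar cutoff (midnight on the 1st of last month), while -mtime +30 is a rolling 30-day window measured from the time the script runs.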
 
