Shell Programming and Scripting: How archive the older than 30 day files to another unix server
Post 302683091 by zaxxon on Tuesday 7th of August 2012 07:44:04 AM
I asked you via PM to use code tags, sent you a video guide, and added a mod comment. You got another PM with a guide.

Quote:
based on the above code we will get only file name.
That's what you asked for.

Quote:
but when i am calling ftp function i getting file name along with directory
Redefine the variable after writing the cut-off (basename-only) names to the list file:
Code:
...
echo "${FILE##*/}" >> file_list   # this file will hold the list of files copied and removed
FILE=${FILE##*/}                  # strip the directory part so only the base name is used from here on
...
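For context, here is a minimal sketch of how the stripped names could be used end to end. The source directory, remote host, remote directory and FTP credentials are placeholders, not details from this thread:

Code:
#!/bin/ksh
# Sketch only: copy files older than 30 days to another server via FTP,
# then remove the local copies.
SRC_DIR=/path/to/source
REMOTE_HOST=remotehost
REMOTE_DIR=/remote/archive

cd "$SRC_DIR" || exit 1
: > file_list                             # start with an empty list of copied files

# -mtime +30 selects files more than 30 full 24-hour periods old;
# the "! -name . -prune" part keeps find from descending into subdirectories.
find . ! -name . -prune -type f -mtime +30 |
while read -r FILE
do
    echo "${FILE##*/}" >> file_list       # record only the base name, as above
done

# One FTP session for everything; the command substitution expands to one
# "put" line per recorded file name.
ftp -inv "$REMOTE_HOST" <<EOF
user ftpuser ftppassword
cd $REMOTE_DIR
$(while read -r f; do echo "put $f"; done < file_list)
bye
EOF

# Finally remove the local copies that were just transferred.
while read -r f
do
    rm -f "$f"
done < file_list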

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Archive files older than 30days

I have files with the names BARE01_DLY_MKT_20060724 in the source directory /biddf/ab6498/dev/ctl. I need to archive folders older than 30days. Like if i have a file named BARE01_DLY_MKT_20060622 I need to move this to /biddf/ab6498/dev/ctl/archive. How can I do this. One more thing is that I... (8 Replies)
Discussion started by: dsravan
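The post above asks how to move files older than 30 days into an archive directory. A minimal sketch of one way to do it, assuming GNU find for -maxdepth (the paths come from the post itself):

Code:
# Move data files older than 30 days into the archive subdirectory;
# -maxdepth 1 keeps find from looking inside archive/ itself.
find /biddf/ab6498/dev/ctl -maxdepth 1 -type f -name 'BARE01_DLY_MKT_*' -mtime +30 \
    -exec mv {} /biddf/ab6498/dev/ctl/archive/ \;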

2. Shell Programming and Scripting

Deleting / finding files older than X days missess a day

Hi When trying to find and delete files which are, say, 1 day, the find command misses a day. Please refer the following example. xxxd$ find . -type f -ctime +1 -exec ls -ltr {} \; total 64 -rw-rw-r-- 1 oracle xxxd 81 Apr 30 11:25 ./ful_cfg_tmp_20080429_7.dat -rw-rw-r-- 1... (4 Replies)
Discussion started by: guruparan18
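The "missing day" comes from the way find counts age: it rounds to whole 24-hour periods, so -ctime +1 only matches files changed more than 48 hours ago. A few illustrative invocations (the -daystart variant is GNU find only):

Code:
find . -type f -ctime +1              # more than 1 full 24-hour period old, i.e. changed over 48 hours ago
find . -type f -ctime +0              # more than 0 full 24-hour periods old, i.e. changed over 24 hours ago
find . -type f -daystart -ctime +1    # GNU find: measure from the start of today instead of from "now"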

3. UNIX for Dummies Questions & Answers

sample script to archive & move previous day syslog files

hi all. Please help me with archiving previous day syslog files. the files have no extension and have the format YYYY-MM-DD. I want to archive the file then move it to some other machine. thanks. (2 Replies)
Discussion started by: coolatt
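A minimal sketch for this one, assuming GNU date to compute yesterday's date; the log directory, destination host and archive path are placeholders:

Code:
#!/bin/ksh
YESTERDAY=$(date -d yesterday +%Y-%m-%d)   # GNU date; other systems need another way to get yesterday

cd /var/log/mysyslogs || exit 1
tar -cf "${YESTERDAY}.tar" "$YESTERDAY" &&
    scp "${YESTERDAY}.tar" user@otherhost:/archive/ &&
    rm -f "${YESTERDAY}.tar" "$YESTERDAY"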

4. Shell Programming and Scripting

Find the number of files older than 1 day from a dir

Hello All, I need to write a script/command which can find out the number of .csv files residing in a directory older than 1 day. The output should come with datewise (means for each date how many files are there). I've this command, but this command gives the total number of files. It's... (10 Replies)
Discussion started by: NARESH1302
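One possible answer, assuming GNU find so that -printf can emit each file's modification date; the directory is a placeholder:

Code:
# Count .csv files older than one day (24 hours), grouped by modification date
find /path/to/dir -type f -name '*.csv' -mtime +0 -printf '%TY-%Tm-%Td\n' |
    sort | uniq -c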

5. Shell Programming and Scripting

Find and delete files and folders which are n days older from one unix server to another unix server

Hi All, Let me know how can i find and delete files from one unix server to another unix server which are 'N' days older. Please note that I need to delete files on remote unix server.So, probably i will need to use sftp, but question is how can i identify files and folders which are 'N'... (2 Replies)
Discussion started by: sachinkl
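If the remote account also permits command execution over ssh (not only sftp), the remote clean-up can be done in one call; host, path and the 30-day value below are placeholders:

Code:
# Delete remote files older than N days without copying anything locally
ssh user@remotehost "find /remote/target/dir -type f -mtime +30 -exec rm -f {} \;"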

6. Shell Programming and Scripting

scp files that are 3 days older from remote server-

hello, i am trying to get a list of files to be scped from the remote server by running the below in my local unix server ( note - there is a passwordless connectivity setup made between the local and remote server) and, we use KSH. --- ssh $scp_host "find /a/b/c/*/ -iname "$remote_file"" >... (4 Replies)
Discussion started by: billpeter3010
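A minimal sketch of the two-step approach hinted at above, assuming passwordless ssh between the hosts; the remote path comes from the post, the local destination is a placeholder:

Code:
#!/bin/ksh
scp_host=remotehost

# 1. Build the list of remote files older than 3 days
ssh "$scp_host" "find /a/b/c -type f -mtime +3" > remote_list

# 2. Copy each listed file back with scp
while read -r f
do
    scp "$scp_host:$f" /local/destination/
done < remote_list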

7. UNIX for Advanced & Expert Users

Help with get/mget from FTP server with files older than 10 minutes

Hi! I am new to unix and this forum as well.. Can someone please help me : I want to "get/mget" files which are older than 10 minutes from a remote FTP server like "ftp.com". After getting the files to local unix server say "Prod.com" , i need to delete only those files from ftp.com which... (4 Replies)
Discussion started by: SravsJaya

8. Shell Programming and Scripting

Sftp - 1 day older files count

Need to write a shell script on AIX box which will connect to different servers using SFTP and get the file count of only 1 day older files. (purging list) How to achieve this? On local server we can use: find <path> -type f -mtime +1 But how to do it in case of SFTP? Please advise. Thanks... (9 Replies)
Discussion started by: vegasluxor

9. Shell Programming and Scripting

Grep files older than 1 day

I thought that this would work for grep'ing files older than 1 day. ps -o etime,pid,user,args -e|awk '/^+-/'|sort -t- -n -k 1,1 |grep qdaemon |grep /usr/bin/ksh But, it is not grep'ing any of files (i.e. below) older than 1 day. d_prod 33757970 61999560 0 Oct 27 - 0:00... (8 Replies)
Discussion started by: Daniel Gate

10. Shell Programming and Scripting

Need command/script to archive files older than

I need to find a way to archive all files older than a given date but there are some conditions that are making it difficult for me to find the correct command: Linux based system (RH5) there are multiple layers of directory depth I need to search each file should be tar'd in it's original... (1 Reply)
Discussion started by: KaosJedi
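For the "older than a given date" part of that question, one portable trick is to compare against a reference file created with touch -t; the cutoff timestamp and search path below are only examples:

Code:
touch -t 201201010000 /tmp/cutoff               # reference file carrying the cutoff: 2012-01-01 00:00
find /data -type f ! -newer /tmp/cutoff -print  # every file, at any depth, not newer than the cutoff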
STOREBACKUPDEL(1)        User Contributed Perl Documentation        STOREBACKUPDEL(1)

NAME
    storeBackupDel.pl - this program deletes backups created by storeBackup

SYNOPSIS
    storeBackupDel.pl [-f configFile] [--print] [-b backupDirectory] [-S series]
        [--doNotDelete] [--deleteNotFinishedDirs] [-L lockFile]
        [--keepAll timePeriod] [--keepWeekday entry]
        [--keepFirstOfYear] [--keepLastOfYear]
        [--keepFirstOfMonth] [--keepLastOfMonth]
        [--keepFirstOfWeek] [--keepLastOfWeek]
        [--keepDuplicate] [--keepMinNumber] [--keepMaxNumber]
        [-l logFile [--plusLogStdout] [--suppressTime] [-m maxFilelen]
         [[-n noOfOldFiles] | [--saveLogs] [--compressWith compressprog]]]

WARNING
    !!! USAGE IN PARALLEL WITH storeBackup.pl CAN DESTROY YOUR BACKUPS !!!

OPTIONS
    --file, -f
        configuration file (instead of parameters)
    --print
        print configuration read from configuration file and stop
    --backupDir, -b
        top level directory of all backups (must exist)
    --series, -S
        directory of backup series; same parameter as in storeBackup,
        relative path from backupDir, default is 'default'
    --lockFile, -L
        lock file; if it exists, new instances will finish if an old one
        is already running, default is $lockFile
    --doNotDelete
        test only, do not delete any backup
    --deleteNotFinishedDirs
        delete old backups which were not finished;
        this will not happen if --doNotDelete is set
    --keepAll
        keep backups which are not older than the specified amount of time.
        This is like a default value for all days in --keepWeekday.
        Deleting begins at the end of the script.
        The time range has to be specified in format 'dhms',
        e.g. 10d4h means 10 days and 4 hours; default = $keepAll
    --keepWeekday
        keep backups for the specified days for the specified amount of time.
        Overwrites the default values chosen in --keepAll.
        'Mon,Wed:40d5m Sat:60d10m' means:
            keep backups of Mon and Wed 40 days + 5 mins
            keep backups of Sat 60 days + 10 mins
            keep backups of the rest of the days as specified in --keepAll
            (default $keepAll)
        If you also use the 'archive flag', the affected directories are not
        deleted via --keepMaxNumber: a10d4h means 10 days and 4 hours plus
        'archive flag', e.g. 'Mon,Wed:a40d5m Sat:60d10m' means:
            keep backups of Mon and Wed 40 days + 5 mins + 'archive'
            keep backups of Sat 60 days + 10 mins
            keep backups of the rest of the days as specified in --keepAll
            (default $keepAll)
    --keepFirstOfYear
        do not delete the first backup of a year;
        format is timePeriod with possible 'archive flag'
    --keepLastOfYear
        do not delete the last backup of a year;
        format is timePeriod with possible 'archive flag'
    --keepFirstOfMonth
        do not delete the first backup of a month;
        format is timePeriod with possible 'archive flag'
    --keepLastOfMonth
        do not delete the last backup of a month;
        format is timePeriod with possible 'archive flag'
    --firstDayOfWeek
        default: 'Sun'. This value is used for calculating
        --keepFirstOfWeek and --keepLastOfWeek
    --keepFirstOfWeek
        do not delete the first backup of a week;
        format is timePeriod with possible 'archive flag'
    --keepLastOfWeek
        do not delete the last backup of a week;
        format is timePeriod with possible 'archive flag'
    --keepDuplicate
        keep multiple backups of one day up to timePeriod;
        format is timePeriod, 'archive flag' is not possible;
        default = $keepDuplicate
    --keepMinNumber
        keep that minimum of backups. Multiple backups of one day are
        counted as one backup. Default is 10.
    --keepMaxNumber
        try to keep only that maximum of backups. If you have more backups,
        the following sequence of deleting will happen:
        - delete all duplicates of a day, beginning with the old ones,
          except the oldest of every day
        - if this is not enough, delete the rest of the backups beginning
          with the oldest, but *never* a backup with the 'archive flag'
          or the last backup
    --keepRelative, -R
        alternative deletion scheme. If you use this option, all other keep
        options are ignored. Preserves backups depending on their *relative*
        age. Example: -R '1d 7d 2m 3m' will (try to) ensure that there is
        always
        - one backup between 1 day and 7 days old
        - one backup between 5 days and 2 months old
        - one backup between 2 months and 3 months old
        If there is no backup for a specified timespan (e.g. because the last
        backup was done more than 2 weeks ago), the next older backup will be
        used for this timespan.
    --logFile, -l
        log file (default is STDOUT)
    --plusLogStdout
        if you specify a log file with --logFile you can additionally print
        the output to STDOUT with this flag
    --suppressTime
        suppress output of time in logfile
    --maxFilelen, -m
        maximal length of file, default = 1e6
    --noOfOldFiles, -n
        number of old log files, default = 5
    --saveLogs
        save log files with date and time instead of deleting the old ones
        (with --noOfOldFiles)
    --compressWith
        compress saved log files (e.g. with 'gzip -9'), default is 'bzip2'

COPYRIGHT
    Copyright (c) 2003-2008 by Heinz-Josef Claes (see README).
    Published under the GNU General Public License v3 or any later version.

perl v5.14.2                         2012-06-16                      STOREBACKUPDEL(1)
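As a hedged illustration of how these options combine, a hypothetical dry run could look like this (the backup directory and series name are made up):

Code:
# Report, without deleting anything, which backups would be removed while keeping
# everything from the last 30 days and protecting the first backup of each month for a year.
storeBackupDel.pl -b /backup -S mySeries --keepAll 30d --keepFirstOfMonth 365d --doNotDelete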