Transfer files to Amazon S3 based on counterfile


 
# 1  
Old 04-25-2012

This is my first attempt at a shell script, and honestly I'm not even sure whether it's sh or bash.

Don't penalise me too much; I'm learning, and I'm going at a slow pace. Below is my effort at a shell script that transfers files using a Perl tool written by a very talented author, Timothy Kay, found at timkay[dot]com/aws/

To save you the read, its use is as simple as /usr/bin/s3put bucket/destination/ source/source.gz
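So, for example, pushing one archive into a bucket path would look like this (the bucket and file names here are only placeholders):

Code:
/usr/bin/s3put mybucket/backups/daily/ /backup/cpbackup/daily/someuser.tar.gz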

My idea for the script is that it sends backups (already generated by cPanel) to my S3 bucket. Each day it checks the counter file to see how far into the month it is, and takes the appropriate action.
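The counter file itself is nothing fancy - just shell variable assignments that the script pulls in with the dot command (the values below are only an example):

Code:
# /home/tools/counterfile - sourced by the script with ". /home/tools/counterfile"
day=3
week=0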

Since I don't have a test environment, only my production server, I was hoping some pros could look this over and spot any obvious blunders on my part. I've tried to comment where necessary.

Code:
#!/bin/bash
## Transfer already created backups to s3

#Backup daily, weekly
(. /home/tools/counterfile
day=$(( $day + 1 ))
if [ $day == 28 ]
then day=0
fi

#It should make a first-, second- and fourth-week backup - basically a monthly backup:
#one daily backup
#one weekly backup, <14 days old but >7 days old
#one backup 2 weeks old
#one backup 1 month old
# Log files for all actions - separate log files for each week

##Define $logfile, $bucket
logfile=/home/tools/logs/s3backup_week"$week".log
bucket="mybucket"
(if [ -e "$logfile" ]; then
  rm -f "$logfile"
fi)

/usr/bin/find /var/cpanel/users -type f -printf "%f\n" |
while read user; do
echo `date '+%D %T'` "Current user is: "$user >> $logfile
#Send Daily Backup to S3
/usr/bin/s3put "$bucket"/"$user"/daily/"$user".tar.gz /backup/cpbackup/daily/"$user".tar.gz >> $logfile 2>&1
echo  `date '+%D %T'` "Copied Daily to /"$user"/"$user"-daily.tar.gz \n" >> $logfile

#Determine the week and push appropriately.
if [ $day -le 6 ]; then 
    #First week
    week=0
    /usr/bin/s3put "$bucket"/"$user"/week-1/"$user".tar.gz /backup/cpbackup/weekly/"$user".tar.gz >> $logfile 2>&1
    echo -e `date '+%D %T'` "Copied weekly to /"$user"/"$user"-weekday"$day".tar.gz \n" >> $logfile
    
else 
    if [ $day -le 14 ]; then
        #Second week
        week=1
        /usr/bin/s3put "$bucket"/"$user"/week-2/"$user".tar.gz /backup/cpbackup/daily/"$user".tar.gz >> $logfile 2>&1
        echo -e `date '+%D %T'` "Copied weekly to /"$user"/"$user"-weekday"$day".tar.gz \n" >> $logfile
        
    else 
        if [ $day -le 28 ]; then
            #Monthly backup
            week=2
            /usr/bin/s3put "$bucket"/"$user"/monthly/"$user".tar.gz /backup/cpbackup/daily/"$user".tar.gz >> $logfile 2>&1
            echo -e `date '+%D %T'` "Copied weekly to /"$user"/"$user"-weekday"$day".tar.gz \n" >> $logfile
fi

echo "week=$week" > /home/tools/counterfile
. /home/tools/counterfile

done
)
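One part I'm not confident about: the counter file write inside the loop replaces the whole file with just the week= line, so the day value is lost, and it runs once per user instead of once per run. A sketch of doing the bookkeeping once, before the per-user loop (it assumes the counter file only ever holds these two assignments):

Code:
# week depends only on day, so it can be derived once up front
week=2
[ "$day" -le 14 ] && week=1
[ "$day" -le 6 ]  && week=0
# persist both counters in one write, keeping day intact for tomorrow's run
printf 'day=%s\nweek=%s\n' "$day" "$week" > /home/tools/counterfile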

Help much appreciated.

---------- Post updated 25-04-12 at 01:32 PM ---------- Previous update was 24-04-12 at 11:22 PM ----------

I spotted some faulty reasoning on my part, so I've changed the specs a little.

Instead of weekly and every two weeks, why not just every 10 days, for 3 cycles - that provides a monthly backup as well.

Is this logic sound, then?

Code:
if [ $day -le 10 ]; then
    #backup-a
else
    if [ $day -le 20 ]; then
        #backup-b
    else
        if [ $day -le 30 ]; then
            #backup-c
fi


# 2  
Old 04-25-2012
Logic question
If $day is less than or equal to 10, then for days 1 - 10 your process will run backup-a, so you get ten backups. Is that what you want?
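A quick way to convince yourself of the difference between the two test operators (the day value is just for illustration):

Code:
day=3
[ "$day" -le 10 ] && echo "-le fires"   # true on every day up to and including 10
[ "$day" -eq 10 ] && echo "-eq fires"   # true on day 10 only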

Programming comment:
There is a really good programming concept, sometimes called "DRY" - Don't Repeat Yourself. This block of code is repeated several times:

Code:
           /usr/bin/s3put "$bucket"/"$user"/monthly/"$user".tar.gz /backup/cpbackup/daily/"$user".tar.gz >> $logfile 2>&1
            echo -e `date '+%D %T'` "Copied weekly to /"$user"/"$user"-weekday"$day".tar.gz \n" >> $logfile

If you create a shell function you can pass it parameters for the one or two strings that are different each time.
Code:
do_backup()
{
    dir=$1
    comment="$2"
    dir2=$3
    /usr/bin/s3put "$bucket"/"$user"/"$dir"/"$user".tar.gz /backup/cpbackup/"$dir2"/"$user".tar.gz >> $logfile 2>&1
    echo -e `date '+%D %T'` "$comment /"$user"/"$user"-weekday"$day".tar.gz \n" >> $logfile
}

#usage
do_backup monthly 'Copied weekly to' daily

Anyway, this lets you quickly see that one of your calls is interesting: monthly, weekly and daily in exactly the same operation. Hmm. This may be correct - but a priori it looks wrong. This is the real benefit of the whole DRY idea.

Find what you are repeating, then feed it the small changes as parameters.
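With destination and source names lifted from your original script, the four pushes collapse to calls like these - which also makes the odd source directories easy to spot:

Code:
do_backup daily   'Copied daily to'   daily
do_backup week-1  'Copied weekly to'  weekly
do_backup week-2  'Copied weekly to'  daily    # source dir differs from week-1 - intended?
do_backup monthly 'Copied monthly to' daily    # same source dir as daily - intended?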
# 3  
Old 04-25-2012
Quote:
Originally Posted by jim mcnamara
Logic question
If $day is less than or equal to 10, then for days 1 - 10 your process will run backup-a, so you get ten backups. Is that what you want?
Oh, I get it. I only want it to run once in each 10-day cycle, so that at the end, on day 30, I have 4 backups: one 30 days old, one 20 days old, one 10 days old, and one copied from the latest one (today's backup copied across).

Will the following function be correct, then?
Code:
#shell function
do_backup()
{
    bkpname=$1
    /usr/bin/s3put "$bucket"/"$user"/"$bkpname"-"$user".tar.gz /backup/cpbackup/daily/"$user".tar.gz >> $logfile 2>&1
    echo -e `date '+%D %T'` "Copied "$bkpname" backup to "$user"/"$bkpname"-"$user".tar.gz \n" >> $logfile
}

#usage
#do_backup segment1
#do_backup daily

Following your logic, is this a good DRY method?
Code:
do_backup daily
if [ $day -eq 10 ]; then
    do_backup segment1
else
    if [ $day -eq 20 ]; then
        do_backup segment2
    else
        if [ $day -eq 30 ]; then
            do_backup segment3
fi

Totally agree with DRY - although the idea is not 'new' to me, I can see I am repeating myself... Could I use the function you wrote within my script, or should it be a new file on its own that I call from the backup.sh file?
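My guess is that the function only has to be defined before its first call, so it could sit at the top of backup.sh, or live in its own file that backup.sh sources (the file name below is just an example):

Code:
#!/bin/bash
# pull in shared functions, then use them
. /home/tools/backup_functions   # hypothetical file that defines do_backup()
do_backup daily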

---------- Post updated at 05:07 PM ---------- Previous update was at 02:39 PM ----------

OK, I've updated my script and now have the following:
Code:
#!/bin/bash
## Transfer already created backups to s3
#Backup daily, and rolling 3 x 10 day cycles.

#shell function
do_rws_backup()
{
    bkpname=$1
    /usr/bin/s3put "$bucket"/"$user"/"$bkpname"-"$user".tar.gz /backup/cpbackup/daily/"$user".tar.gz >> $logfile 2>&1
    echo -e `date '+%D %T'` "Copied "$bkpname" backup to "$user"/"$bkpname"-"$user".tar.gz \n" >> $logfile
}

#usage
#do_rws_backup segment1
#do_rws_backup daily

(. /home/tools/counterfile
day=$(( $day + 1 ))
if [ $day == 30 ]
then day=0
fi
echo "day=$day" > /home/tools/counterfile
. /home/tools/counterfile

#one daily backup 
#3 Rolling backups made with 10 day intervals
# Log files for all actions - separate log file for each day of the cycle

##Define $logfile, $bucket
logfile=/home/tools/logs/s3backup_week"$day".log
bucket="rws-delta-backup"
(if [ -e "$logfile" ]; then
  rm -f "$logfile"
fi)

/usr/bin/find /var/cpanel/users -type f -printf "%f\n" |
while read user; do
echo `date '+%D %T'` "Current user is: "$user >> $logfile

#Send Daily Backup to S3
do_rws_backup daily

#Do backups rolling 10 day intervals
    if [ $day -eq 10 ]; then       
        do_rws_backup segment1
    else
        if [ $day -eq 20 ]; then   
            do_rws_backup segment2
        else       
            if [ $day -eq 30 ]; then     
                do_rws_backup segment3
    fi
done
)

I get the following error though:

Code:
/home/tools/rwsbackup.sh: line 55: syntax error near unexpected token `done'
/home/tools/rwsbackup.sh: line 55: `done'
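Looking at the nesting again, I suspect the problem is that every if ... then has to be closed by its own fi, and that block opens three ifs but closes only one before the done. Rewriting the chain with elif needs just a single fi - a sketch of that block:

Code:
#Do backups rolling 10 day intervals
if [ $day -eq 10 ]; then
    do_rws_backup segment1
elif [ $day -eq 20 ]; then
    do_rws_backup segment2
elif [ $day -eq 30 ]; then
    do_rws_backup segment3
fi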
