Using rsync --link-dest pointing to a remote server


 
# 1  
Old 06-21-2013

Ok, I'm trying to figure out how to reference a remote file using the rsync --link-dest parameter.

Here is the code I have so far:

Code:
#
# FILESERVER INCREMENTAL BACKUP SCRIPT
#

# Remove previous log file

rm /usr/local/bin/rsync-incremental.log

# Set daily variables

DAY0=`date -I`
DAY1=`date -I -d "1 day ago"`
DAY6=`date -I -d "2 days ago"`
DAY7=`date -I -d "3 days ago"`
DAY8=`date -I -d "4 days ago"`

# Set rsync variables

SRC="/usr/smbshares"
TRG="root@synology1:/volume1/Fileserver/$DAY0"
LNK="root@synology1:/volume1/Fileserver/$DAY1"
OPT="-avhz -e ssh --log-file=/usr/local/bin/rsync-incremental.log --delete --link-dest=$LNK"

# Remove old backups

ssh root@synology1 "if [ -d /volume1/Fileserver/$DAY6 ]; then rm -rf /volume1/Fileserver/$DAY6; fi"

ssh root@synology1 "if [ -d /volume1/Fileserver/$DAY7 ]; then rm -rf /volume1/Fileserver/$DAY7; fi"

ssh root@synology1 "if [ -d /volume1/Fileserver/$DAY8 ]; then rm -rf /volume1/Fileserver/$DAY8; fi"

# Perform the rsync

rsync $OPT $SRC $TRG

Everything is working now except the --link-dest parameter. I get the following error at the beginning of execution:

--link-dest arg does not exist: root@synology1:/volume1/Fileserver/2013-06-20

What's funny is that rsync has no issue using the same syntax for the target folder of the command...

Anyone know how to reference the remote file for this?

Thanks!

---------- Post updated at 01:52 PM ---------- Previous update was at 01:31 PM ----------

If anyone is wondering, I'm trying to back up a SLES Samba server to a remote Synology NAS. I want to keep 5 days' worth of backups on the NAS, but only the original or changed files, not a full copy of every file for all five days.

If a backup folder exists that is 6, 7 or 8 days old, I want to remove it to keep files from accumulating on the NAS over time.

I know the $DAY6, etc. variables are wrong as I changed them for testing purposes.

So that's what I'm trying to accomplish. If you can think of any better way to accomplish this, I'd be glad to give that a shot as well, but from what I've read it seems rsync is pretty reliable and quick.

As it's working right now, I am getting the 5 days' worth of backups, but it's copying every file every time since the --link-dest isn't working.
# 2  
Old 06-21-2013
That option expects a directory on the same device as the destination. It makes a clone tree, with the same inode present at the same relative path in both subtrees. A hard link means the same filesystem on the same machine, so you can't point --link-dest at a second user@host: spec.
# 3  
Old 06-21-2013
Ok, so is there another solution?

---------- Post updated at 06:08 PM ---------- Previous update was at 05:56 PM ----------

I may have found something. I'll post any results.

---------- Post updated at 07:59 PM ---------- Previous update was at 06:08 PM ----------

I am testing the following code:

Code:
#
# FILESERVER INCREMENTAL BACKUP SCRIPT
#

# Remove previous log file

rm /usr/local/bin/rsync-incremental.log

# Set variables

BACKUPDIR=`date +%A`
SRC="/usr/smbshares/"
TRG="root@synology1:/volume1/Fileserver/Current"
OPT="-avhz -e ssh --log-file=/usr/local/bin/rsync-incremental.log --delete --backup --backup-dir=$BACKUPDIR"

# Remove old backups

rsync --delete -a /usr/local/bin/emptydir/ root@synology1:/volume1/Fileserver/$BACKUPDIR/

# Perform the rsync

rsync $OPT $SRC $TRG

Will let you know how things turn out over the weekend...
# 4  
Old 06-24-2013
There is no doubt that rsync can make a remote copy, and it can be set up to maintain one continuously, too. Make a little subtree to play with, and get the feel of it. You can run it over ssh for security, and add compression with ssh -C or rsync -z, which sometimes lightens the net load and speeds things up.

If you want to keep some daily states, you can make a clone tree (same subdirectories, all files hard-linked from the source subtree) remotely to back up into, so each unchanged file is stored in one inode across all consecutive versions. Users love being able to get a file as it was N days ago, or to peruse the back versions. Just make sure rsync replaces files rather than writing them in place (its default, unless you use --inplace), since an in-place write to a hard-linked file updates all versions at once.

There may be tools that run on top of rsync or use similar methods and manage this sort of backup for you. In the Internet Age, you have to imagine and Google.
# 5  
Old 06-25-2013
Ok, thanks for that additional info.

However, looking at the script, I'm thinking this should definitely work, but for some reason nothing is ever getting copied into the backup directory: it gets created, but nothing is placed into it.

Here is the code I'm running:

Code:
#
# FILESERVER INCREMENTAL BACKUP SCRIPT
#

# Remove previous log file

rm /usr/local/bin/rsync-incremental.log

# Set variables

BACKUPDIR=`date +%A`
SRC="/usr/smbshares/"
TRG="root@synology1:/volume1/Fileserver/Current/"
OPT="-avhz -e ssh --log-file=/usr/local/bin/rsync-incremental.log --delete --backup --backup-dir=$BACKUPDIR"

echo BACKUPDIR > /usr/local/bin/rsync-incremental.log
echo $BACKUPDIR >> /usr/local/bin/rsync-incremental.log
echo ------------------ >> /usr/local/bin/rsync-incremental.log

# Remove old backups

rsync --delete -a /usr/local/bin/emptydir/ root@synology1:/volume1/Fileserver/$BACKUPDIR

echo SOURCE >> /usr/local/bin/rsync-incremental.log
echo $SRC >> /usr/local/bin/rsync-incremental.log
echo ------------------ >> /usr/local/bin/rsync-incremental.log
echo TARGET >> /usr/local/bin/rsync-incremental.log
echo $TRG >> /usr/local/bin/rsync-incremental.log
echo ------------------ >> /usr/local/bin/rsync-incremental.log
echo OPT >> /usr/local/bin/rsync-incremental.log
echo $OPT >> /usr/local/bin/rsync-incremental.log
echo ------------------ >> /usr/local/bin/rsync-incremental.log

# Perform the rsync

rsync $OPT $SRC $TRG

Here is the top of the rsync-incremental.log file with the variables spelled out:

BACKUPDIR
Tuesday
------------------
SOURCE
/usr/smbshares/
------------------
TARGET
root@synology1:/volume1/Fileserver/Current/
------------------
OPT
-avhz -e ssh --log-file=/usr/local/bin/rsync-incremental.log --delete --backup --backup-dir=Tuesday
------------------


So the rsync line passed is:

Code:
rsync -avhz -e ssh --log-file=/usr/local/bin/rsync-incremental.log --delete --backup --backup-dir=Tuesday /usr/smbshares/ root@synology1:/volume1/Fileserver/Current/

But, as I said, nothing is ever copied into the Tuesday folder! It is created, but nothing is under it. Also, there are definite changes made to SEVERAL HUNDRED files per day under /usr/smbshares, I can assure you.

Now, to test, I took one of the subdirectories under /usr/smbshares and typed the following line:

Code:
rsync -avhz -e ssh --delete --backup --backup-dir=darrellpbackup /usr/smbshares/darrellp root@synology1:/volume1/Fileserver/

I then went and made a small change to one of the files, ran it again, and it backed up the previous copy of the file in the backup folder!

Can you guys see anything I'm missing here?

---------- Post updated at 10:45 AM ---------- Previous update was at 09:53 AM ----------

Ok, I've figured this out somewhat.

It is creating a folder at /volume1/Fileserver/Tuesday, but it is ALSO creating a folder at /volume1/Fileserver/Current/Tuesday.

The second is where the backups are being placed, but the first is empty (and that's where I've been looking for the backup files).

Any idea, based on the rsync command, why it would create the first folder?

Thanks!

---------- Post updated at 01:09 PM ---------- Previous update was at 10:45 AM ----------

Ok, after fiddling some more, I've changed my code to:

Code:
#
# FILESERVER INCREMENTAL BACKUP SCRIPT
#

# Remove previous log file

rm /usr/local/bin/rsync-incremental.log

# Set variables

BACKUPDIR=`date +%A`
SRC="/usr/smbshares/"
TRG="root@synology1:/volume1/Fileserver/Current/"
OPT="-avhz -e ssh --log-file=/usr/local/bin/rsync-incremental.log --delete --backup --backup-dir=/volume1/Fileserver/$BACKUPDIR"

echo BACKUPDIR > /usr/local/bin/rsync-incremental.log
echo $BACKUPDIR >> /usr/local/bin/rsync-incremental.log
echo ------------------ >> /usr/local/bin/rsync-incremental.log

# Remove old backups

rsync --delete -a /usr/local/bin/emptydir/ root@synology1:/volume1/Fileserver/$BACKUPDIR

echo SOURCE >> /usr/local/bin/rsync-incremental.log
echo $SRC >> /usr/local/bin/rsync-incremental.log
echo ------------------ >> /usr/local/bin/rsync-incremental.log
echo TARGET >> /usr/local/bin/rsync-incremental.log
echo $TRG >> /usr/local/bin/rsync-incremental.log
echo ------------------ >> /usr/local/bin/rsync-incremental.log
echo OPT >> /usr/local/bin/rsync-incremental.log
echo $OPT >> /usr/local/bin/rsync-incremental.log
echo ------------------ >> /usr/local/bin/rsync-incremental.log

# Perform the rsync

rsync $OPT $SRC $TRG

Seems to be working so far; I'll let you know after a day or two whether things are being backed up to the appropriate folder. What was happening is that the "Clear" rsync was going to one folder while the "Backup" rsync was going to the other: a relative --backup-dir is resolved under the destination directory, so it ended up inside Current. By specifying the backup folder with its full path, they now point to the same folder. I wasn't sure that would work, since I was specifying a path on the remote server, but rsync seems to like it!

Last edited by Orionizer; 06-25-2013 at 02:23 PM..
# 6  
Old 06-25-2013
Thanks for the updates!
# 7  
Old 06-25-2013
No problem! Hoping this works and, of course, I'll let you know one way or the other. I do believe I've nixed the issue, though!

I should also state, sorry for being so rusty. The last time I worked on any serious shell scripts was in college in 1990 or so (on an AT&T UNIX box), so I'm quite out of practice.