I have a set of working directories on a remote computer accessible through ssh, and the same directory structure in the home directory of my laptop. I sometimes work on my laptop and sometimes on this other computer.
I usually use rsync this way to synchronize the files.
I run both of these commands as a shell script.
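They are the usual push/pull pair, roughly like this (the hostname and path here are placeholders, not my actual ones):

```shell
# Roughly what the two commands look like (hostname and path are
# placeholders for the real ones):
rsync -avz ~/work/ remotehost:~/work/    # push local changes to the remote
rsync -avz remotehost:~/work/ ~/work/    # pull remote changes to the laptop
```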
This seems to work fine, but the problem I face is this: if I rename a file on the local host, or delete a file on the remote host, then after I run these two commands I end up with the old files plus the new files in both places.
So before running rsync, I would like to find out (say, with ls and diff) which files are present in the remote directory but not the local one, and vice versa.
Then I can review those files, delete them or not, and then run the rsync commands.
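One way to get that "only on one side" list is to compare sorted file listings with comm. A minimal sketch, shown here with two local stand-in directories (in practice the remote list would be fetched with something like `ssh remotehost 'cd ~/work && find . -type f | sort' > remote.list`):

```shell
# Sketch: find files present on one side but not the other.
# demo/local and demo/remote stand in for the two working copies.
mkdir -p demo/local demo/remote
touch demo/local/common.txt demo/local/only_local.txt
touch demo/remote/common.txt demo/remote/only_remote.txt

# comm needs sorted input, so sort both listings
(cd demo/local  && find . -type f | sort) > local.list
(cd demo/remote && find . -type f | sort) > remote.list

echo "Only in local:";  comm -23 local.list remote.list
echo "Only in remote:"; comm -13 local.list remote.list
```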
Is it possible to automate all this with some better shell scripting than just plain rsync?
Also, I want to know how useful crontab is for this kind of work. Can I make crontab run at a particular time of day or week when both machines are on?
crontab is obviously very useful, with two reservations.
Obviously, if one of the machines is down, cron will not run. You could set up some mechanism to compensate for that (write a timestamp file, and check that the file was created), but it would be a bother.
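The timestamp idea could be sketched like this (filenames and the one-day threshold are made up):

```shell
# Sketch: touch a stamp file after a successful sync, then check its
# age later to notice a missed run.
STAMP=last-sync.stamp

# ... run the real rsync commands here; on success, record the time:
touch "$STAMP"

# Elsewhere (e.g. from a login script), warn if the stamp is stale:
if [ -n "$(find . -maxdepth 1 -name "$STAMP" -mtime +1)" ]; then
    echo "warning: last successful sync was more than a day ago"
fi
```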
You could make the two machines run a job at the same time, but you would have to make sure the clocks are exactly synchronized. It's probably better to let one of them do its thing first, and then an hour or so later let the other one run. Or there might be some way to do everything on one machine and have it run a remote process on the other.
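The staggered approach might look like this as crontab entries (a config fragment; times and script names are hypothetical):

```shell
# On the office machine (crontab -e): push at 22:00
0 22 * * * /home/user/bin/sync-push.sh
# On the laptop: pull an hour later
0 23 * * * /home/user/bin/sync-pull.sh
```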
I was thinking about the crontab approach, but it looks a little messy, because one of the machines is my laptop (which goes into sleep mode, or I shut it down, when I go out), and the other is the office computer.
So I guess in this situation, it would be best to:
1) Create a shell script of rsync commands.
2) Check the filenames on both systems first, and see which files changed today at work. A combination of find with some options should work here; ls -ltR would give the entire set of files ordered by time, from oldest to most recent.
3) Delete the extra files on both systems (autosave files, etc.). Here too I think find could be useful, but it needs to be run on both machines.
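Steps 2) and 3) could be sketched with find like this (the demo directory and filenames are made up):

```shell
# Sketch: spot recently changed files and editor autosave files.
mkdir -p demo_work
touch demo_work/report.txt demo_work/report.txt~

# files modified within the last day:
find demo_work -type f -mtime -1

# editor autosave/backup files that are candidates for deletion:
find demo_work -type f -name '*~'

# after reviewing the list, they could be removed with:
# find demo_work -type f -name '*~' -delete
```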
You're suggesting maintaining two separate working copies of a bunch of files and keeping them synced. I don't do that, so I'm reluctant to advise on it. If things didn't work correctly, it could be a big mess.
I always have just one working copy of everything, and I always access and work on that one copy. I think it's safer to just have one working copy. I back up that one copy in multiple ways to make sure it is not lost, and that I can retrieve a previous version if needed.
I think the only person who could advise you would be someone who keeps two working copies of everything, as you want to do.
Is there any chance you might just keep the files on the remote computer, and access them from your laptop?
If I have these two working copies, that's because some days I work on my laptop and some days I work at the office computer.
I was wondering whether it's possible to back up to a third location (say, an external HD), because today I am not able to access my office computer remotely over ssh.
so should I run it like this
I think I am complicating things here, but is there a simpler way out?