My client is using Perforce. Like CVS, Perforce has a feature where you can create your own branch and check broken code into it without breaking the mainline build. When you finally reach a point where you won't break the build, you merge your branch back into the mainline.
Unfortunately, they are not using the branch feature that way here. Maybe we should be, but we are not.
The consequence is that I have been working on some files for a month and they are not getting backed up, because they only exist in my Perforce sandbox. They need to be checked in (so they get backed up) but cannot be, because I'm not done with them yet (I don't want to break the build!).
I've never used rsync or unison before. Is there some utility or set of commands I can use to recursively descend my directory tree looking for files that are not set to "read only" and copy them, along with their directory paths, to a directory that will be backed up?
It seems like the find command could do this, but I'm not sure how to recreate the directory path under some other directory. I would want to run this daily and only copy files that have changed and are not read-only.
Is there some utility or set of commands I can use to recursively descend my directory tree ...
Yes, rsync works recursively (option -r, --recursive).
Quote:
Originally Posted by siegfried
...looking for files that are not set to "read only"
If you mean filter on the permission, I don't think it is possible without first constructing an include FILE (option --include-from=FILE) piped from a find command, e.g. find . -type f -perm -200 for the owner-writable files (-perm 444 would instead match only the read-only ones, the opposite of what you want here).
The format of the rsync include file is not easy to understand but once you get the idea you will easily make that include file.
As I found your question quite relevant to the use of rsync, I made a test based on my scenario above.
Not so complicated after all.
The script first produces an INC_FILE listing each file to copy together with its parent directories, then feeds it to rsync via --include-from.
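The original post's script and sample INC_FILE did not survive, but here is a minimal self-contained sketch of the same idea. All paths are throwaway temp-directory stand-ins, not the poster's actual layout: $SRC plays the Perforce sandbox, $DST the backed-up area.

```shell
#!/bin/sh
# Sketch: build an rsync include file from the writable (checked-out) files,
# then sync only those paths into the backup directory.
SRC=$(mktemp -d)      # stand-in for the Perforce sandbox
DST=$(mktemp -d)      # stand-in for the backed-up directory
INC_FILE=$(mktemp)

# Demo content: one checked-out (writable) file, one untouched (read-only) file.
mkdir -p "$SRC/proj/src"
echo edited > "$SRC/proj/src/main.c"
echo frozen > "$SRC/proj/src/lib.c"
chmod 444 "$SRC/proj/src/lib.c"

# List owner-writable files relative to $SRC, plus every parent directory,
# because rsync only descends into directories the filter rules let through.
( cd "$SRC" &&
  find . -type f -perm -200 | sed 's|^\./||' |
  while read -r f; do
      printf '%s\n' "$f"
      d=$(dirname "$f")
      while [ "$d" != "." ]; do
          printf '%s/\n' "$d"
          d=$(dirname "$d")
      done
  done
) | sort -u > "$INC_FILE"

# Include the listed paths, exclude everything else (rule order matters:
# the include-from rules are read before the trailing --exclude).
rsync -a --include-from="$INC_FILE" --exclude='*' "$SRC"/ "$DST"/

find "$DST" -type f
```

After the run, only proj/src/main.c (the writable file) exists under $DST; the read-only lib.c is excluded.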
One could ask why bother with rsync when a simple cp could do the job. Well, rsync is very efficient when copying across a network (check the --compress option). It checks file stats and, when it finds a file that needs updating, splits both the source and target files into small chunks, checksums those chunks, and copies only the ones that don't match. It saves bandwidth and time. Very efficient (have a look at the speedup factor rsync reports when it finishes).
There are many more goodies too, such as transfers over ssh.
Give it a try, and don't hesitate to use the --dry-run option, especially if you play with the numerous --delete options! I once erased my source directory with the lethal --remove-sent-files option.
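The rehearsal habit looks like this (again with throwaway temp directories as stand-ins): -n / --dry-run makes rsync report what it would do without touching anything, which is cheap insurance before any --delete run.

```shell
SRC=$(mktemp -d); DST=$(mktemp -d)
echo hello > "$SRC/a.txt"

# --dry-run (-n): itemize what would be transferred, but copy nothing
rsync -avn "$SRC"/ "$DST"/
```

If the dry-run output looks right, re-run the same command without -n to do the real transfer.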