Shell Programming and Scripting: BSD, Linux, and UNIX shell scripting. Post awk, bash, csh, ksh, perl, php, python, sed, sh, and other shell scripting questions here.


Is this possible with SCP?

#1  02-04-2009, cpabrego (Registered User)

I normally download a directory recursively using:

scp -r <name>@host:<path> .

This has worked fine; as you know, it downloads the directory named in <path> and all of its subdirectories.

I would like to know whether it is possible to skip a particular file wherever it appears in the directory tree.

Every subdirectory contains a file with the same name that I do not need: it is large, binary, and of little use to me. I would like to skip it rather than waste time downloading it, since there may be hundreds of directories.

If this is not possible with scp, does anyone know how I might accomplish this requirement?

Thank you!
#2  02-04-2009, pludi (Forum Advisor)
With scp alone, probably not. But you can try this ugly little thing:

Code:
$ ssh <name>@host 'tar -cf - `find <path> ! -name exclude_me -print`' | tar -xf -

It's even possible to pipe it through compress/gzip/bzip2 on both sides to save on bandwidth.
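One caveat with the find-based selection: without a -type f test, find also prints directory names, and tar then archives those directories recursively, which pulls the excluded file back in. A minimal local sketch of the corrected selection (no ssh involved; the demo paths and the -type f addition are this sketch's own, not from the post above):

```shell
# Build a sample tree with an unwanted file named exclude_me in each subdirectory.
mkdir -p demo/sub1 demo/sub2
echo keep1 > demo/sub1/keep.txt
echo big   > demo/sub1/exclude_me
echo keep2 > demo/sub2/keep.txt
echo big   > demo/sub2/exclude_me

# -type f restricts find to plain files, so tar never recurses into a
# directory and re-adds the excluded file; ! -name drops the unwanted names.
tar -cf demo.tar $(find demo -type f ! -name exclude_me -print)

# List the archive contents: the exclude_me files are absent.
tar -tf demo.tar
```

The same find expression can be dropped into the ssh command above.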
#3  02-05-2009, cpabrego (Registered User)
Quote: Originally Posted by pludi
With scp alone, probably not. But you can try this ugly little thing:

Code:
$ ssh <name>@host 'tar -cf - `find <path> ! -name exclude_me -print`' | tar -xf -

It's even possible to pipe it through compress/gzip/bzip2 on both sides to save on bandwidth.

Thanks for the input. I have tested the command, and this is what it does.

It downloads all of the directories (e.g., /dir1/dir2/dir3/dirofinterest) without downloading any of the files in dir1, dir2, or dir3, which is good.

However, dirofinterest contains around 40 more directories, each with a commonly named file I'll call LEAVE. I would like to exclude these because I do not need them and they are usually at least 20 MB each.

This is what I typed:

Code:
$ ssh <name>@host 'tar -cf - `find /dir1/dir2/dir3/dirofinterest ! -name LEAVE -print`' | tar -xf -

Since I am new, it is possible that I misinterpreted your instructions. Let me know what you think.
#4  02-05-2009, cpabrego (Registered User)
Solution

This solution works for me:

Code:
tar czvf <tar file name> <root folder or list of folders> --exclude <filetoexclude>

Then use scp to download the compressed file, and unpack it:

Code:
tar xzvf <tar filename>
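Those three steps can be sketched end to end; this is a local simulation (the scp step is only indicated in a comment because it needs a remote host, and all names here are illustrative; --exclude is a GNU tar option):

```shell
# Build a sample tree containing a large unwanted file named LEAVE.
mkdir -p site/data
echo wanted > site/data/wanted.txt
echo huge   > site/data/LEAVE

# Step 1 (run this on the remote host): pack the tree, skipping LEAVE.
tar czf site.tgz --exclude LEAVE site

# Step 2 would be: scp <name>@host:site.tgz .

# Step 3 (locally): unpack the archive.
mkdir -p unpacked
tar xzf site.tgz -C unpacked
```

After step 3, unpacked/site contains everything except the LEAVE files.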
#5  02-06-2009, lemac (Registered User)
You could also use rsync with an exclude pattern :-)
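A minimal sketch of that approach, run locally here so it is self-contained; against a remote host the source would be <name>@host:<path>/ instead of src/ (all names are illustrative):

```shell
# Build a sample tree with an unwanted file named LEAVE.
mkdir -p src/sub
echo keep > src/sub/keep.txt
echo big  > src/sub/LEAVE

# Copy recursively, skipping every file that matches the exclude pattern.
# Against a remote host this would read:
#   rsync -a --exclude=LEAVE <name>@host:<path>/ dst/
rsync -a --exclude=LEAVE src/ dst/
```

rsync also skips files already present at the destination, which helps if the download is repeated.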
#6  02-07-2009, ddreggors (Registered User)
I use this all the time; it even gzips the stream (compresses it for a faster transfer).



Code:
tar czf - </path/to/file_or_folder> --exclude LEAVE | ssh <user@host> tar xzf - -C </path/to/copy_to>

Note:
If you are familiar with tar, you know that if you start at, say, "/" and tar the path "/var/www/html", the archive stores the full path, and un-tarring recreates it (/var/www/html). To avoid this, cd to "/var/www/html" and run "tar ./*" so that the archive does not contain the full path.

Last edited by ddreggors; 02-07-2009 at 03:40 AM.