Full Discussion: FTP a huge Size file
Post 302548002 by h@foorsa.biz on Wednesday 17th of August 2011 08:23:22 AM
FTP does well with individual large files.
rsync is also a good option worth considering.
Just google "FTP vs rsync" for a comparison.
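As a rough sketch of the rsync route (the host name, user name, and paths below are placeholders, not anything from this thread), a resume-friendly transfer of a single large file over SSH could look like this:

# -a preserves permissions, ownership and timestamps
# -P is shorthand for --partial --progress: keep partially
#    transferred data and show progress, so an interrupted
#    transfer can be re-run instead of restarted from zero
rsync -aP /data/bigfile.tar.gz user@remotehost:/backup/

If the connection drops, re-running the same command lets rsync reuse the partially transferred data rather than resending the whole file, which is the main practical advantage over plain FTP for very large files.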
 

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

ftp file size

How to compare the size of a file which has been received through an ftp get from a remote location with the local copy available? Any clue? Regards. (5 Replies)
Discussion started by: sathiya

2. UNIX for Dummies Questions & Answers

ftp hangs on file size = 0

I have an ftp process which runs every 10 minutes between Unix and an NT box. Normally it works, but when the script tries to get a file from NT that has a length of 0, the ftp process hangs (as if it is still waiting for the end of the file). This is the script... (3 Replies)
Discussion started by: mheinrich

3. Shell Programming and Scripting

How to compare file size after ftp?

Is it possible to compare file size on source and destination after an ftp transfer? If anybody knows, please explain it to me. (1 Reply)
Discussion started by: icemania

4. UNIX for Advanced & Expert Users

Need to synchronize filesystems with huge size

Hello, I am using the IXOS 3rd-party utility to maintain invoices (images) in my client's SAP system. We have a DR (Disaster Recovery) setup, and for that I need to synchronize my /ixos filesystem (more than 50 GB in size) between two servers, say x182 & x050. Initially I thought to tar the... (4 Replies)
Discussion started by: vishal_ranjan

5. Shell Programming and Scripting

Checking the size of a file after FTP

Hi, I am doing an FTP process through which I am copying a file from my local server to a remote server. After this I want to check the size of the file. Below is my program: LOCALDIR=/batch/ediprocess REMOTESERVER=test.appl.com REMOTEPATH=batch/ftpTest LOGIN=px PASSWORD=abcd ftp -n... (3 Replies)
Discussion started by: shanth_chandra

6. Shell Programming and Scripting

Implement in one line sed or awk having no delimiter and file size is huge

I have a file which contains around 5000 lines. The lines are fixed length but have no delimiter. Each line contains nearly 3000 characters. I want to delete a line a> if it starts with 1 and the 576th position is a digit, i.e. 0-9, or b> if it starts with 0 or 9 (i.e. header and footer) ... (4 Replies)
Discussion started by: millan

7. Shell Programming and Scripting

Optimised way for search & replace a value on one line in a very huge file (File Size is 24 GB).

Hi Experts, I had to edit a particular value in the header line of a very huge file, so I wanted to search & replace a particular value in a file which was 24 GB in size. I managed to do it, but it took a long time to complete. Can anyone please tell me how we can do it in an optimised... (7 Replies)
Discussion started by: manishkomar007

8. HP-UX

Performance issue with 'grep' command for huge file size

I have 2 files; one file (say, details.txt) contains the details of employees and another file (say, emp.txt) has some selected employee names. I am extracting employee details from details.txt by using emp.txt and the corresponding code is: while read line do emp_name=`echo $line` grep -e... (7 Replies)
Discussion started by: arb_1984

9. AIX

FTP huge file transfer

Hi, I need to transfer 2000 files from one host to another. I modified /etc/security/limits to -1 and checked ulimit -f, ulimit -s, ulimit -a. Even then only 700 files are transferred. Could you please help me sort out this issue? I think some configuration related to memory is... (3 Replies)
Discussion started by: Priya Amaresh

10. UNIX for Advanced & Expert Users

FTP zero size file

Hi, I am using UNIX AIX ksh. I need clarification regarding sending a zero-size file to another server. VAL=ftp.sh -c put souce_file Dest_file $1 $2 $3 $4 $5 $VAL 2 > $ERR When I am sending the zero-size file I get the alert "netout write returned Zero", but I would like to know what $VAL... (4 Replies)
Discussion started by: Venkatesh1
rsync_selinux(8)                rsync SELinux Policy documentation                rsync_selinux(8)

NAME
       rsync_selinux - Security Enhanced Linux Policy for the rsync daemon

DESCRIPTION
       Security-Enhanced Linux secures the rsync server via flexible mandatory access control.

FILE_CONTEXTS
       SELinux requires files to have an extended attribute to define the file type. Policy governs
       the access daemons have to these files. If you want to share files using the rsync daemon,
       you must label the files and directories public_content_t. So if you created a special
       directory /var/rsync, you would need to label the directory with the chcon tool.

       chcon -t public_content_t /var/rsync

       To make this change permanent (survive a relabel), use the semanage command to add the
       change to the file context configuration:

       semanage fcontext -a -t public_content_t "/var/rsync(/.*)?"

       This command adds the following entry to /etc/selinux/POLICYTYPE/contexts/files/file_contexts.local:

       /var/rsync(/.*)?    system_u:object_r:public_content_t:s0

       Run the restorecon command to apply the changes:

       restorecon -R -v /var/rsync/

SHARING FILES
       If you want to share files with multiple domains (Apache, FTP, rsync, Samba), you can set a
       file context of public_content_t and public_content_rw_t. These contexts allow any of the
       above domains to read the content. If you want a particular domain to write to the
       public_content_rw_t domain, you must set the appropriate boolean, allow_DOMAIN_anon_write.
       So for rsync you would execute:

       setsebool -P allow_rsync_anon_write=1

BOOLEANS
       system-config-selinux is a GUI tool available to customize SELinux policy settings.

AUTHOR
       This manual page was written by Dan Walsh <dwalsh@redhat.com>.

SEE ALSO
       selinux(8), rsync(1), chcon(1), setsebool(8), semanage(8)

dwalsh@redhat.com                               17 Jan 2005                               rsync_selinux(8)
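As a worked example of the SHARING FILES section above, a minimal sketch of setting up a writable rsync share might look like the following. The directory /var/rsync_upload is a hypothetical path chosen for illustration, and the boolean name is the one given in this man page (other policy versions may name it differently):

# create and label a hypothetical upload directory as writable shared content
mkdir -p /var/rsync_upload
semanage fcontext -a -t public_content_rw_t "/var/rsync_upload(/.*)?"
restorecon -R -v /var/rsync_upload

# allow the rsync daemon to write to public_content_rw_t content
# (boolean name as documented above; some policy versions use a different name)
setsebool -P allow_rsync_anon_write=1

# verify the resulting label and boolean value
ls -Zd /var/rsync_upload
getsebool allow_rsync_anon_write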