Error while copying a huge amount of data in AIX


# 1  
Error while copying a huge amount of data in AIX

Hi

When I copy 300 GB of data from one filesystem to another in AIX, I get this error:

tar: 0511-825 The file 'SAPBRD.dat' is too large.

The command I used is:

Code:
# tar -cf - . | (cd /sapbackup ; tar -xf - )

I'm copying as root.

Below is my ulimit -a output:

Code:
time(seconds)        unlimited
file(blocks)         unlimited
data(kbytes)         unlimited
stack(kbytes)        4194304
memory(kbytes)       unlimited
coredump(blocks)     unlimited
nofiles(descriptors) 2000
threads(per process) unlimited
processes(per user)  unlimited

Please assist.
# 3  
Is your tar a 64-bit binary? You can check with:
Code:
file /usr/bin/tar

Is your OS 64-bit?
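
A quick sketch of how to check the OS side on AIX (bootinfo needs root; getconf is standard):

Code:
# report whether the running kernel is 32-bit or 64-bit
bootinfo -K
getconf KERNEL_BITMODE
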
# 4  
Many old implementations of tar have an 8-gigabyte limit for individual files. If you have or can get GNU tar, it doesn't have this limit. Neither does pax, the modern POSIX-standard archiver, though it has an odd, cpio-like syntax.
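
A minimal sketch of both alternatives, assuming GNU tar is installed as gtar (a common but not guaranteed name on AIX) and /sapbackup is the destination from the original post:

Code:
# GNU tar pipe copy; gtar handles member files larger than 8 GB
gtar -cf - . | (cd /sapbackup && gtar -xf -)

# pax in read/write (copy) mode; -pe preserves ownership, modes and timestamps
pax -rw -pe . /sapbackup

Both commands are run from inside the source filesystem, just like the original tar pipe.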


Quote:
Originally Posted by bartus11
Try cpio instead:
Code:
find . | cpio -pmdu /sapbackup

How will cpio help? Unlike tar, it was never modernized to support files larger than 8 GB.