Compress files >2GB


 
# 1  
Old 04-03-2009
Compress files >2GB

Hi folks,

I'm trying to compress a number of files from a CIFS mount to an XFS mount, but it fails whenever the total size of the files is bigger than 2GB.
Is there any limitation above 2GB?

The OS is SLES, 64-bit.
Each file is at most 1MB, so there are approx. 2000 files to compress.
The CIFS share is on a Windows Server 2003 R2 Enterprise server.
The XFS filesystem is a 1TB logical volume on a local RAID with plenty of free space.

I'm trying the following:

Code:
myfunction () {
    # list the files older than 30 days into a temp file
    find $ORIG -type f -mtime +30 -print > /tmp/tempfile.$$
    # archive them (bzip2) to stdout and split the stream into 40MB chunks
    tar -cvjf - `cat /tmp/tempfile.$$` | split -b 40m -d - $DEST/$TARFILE
    # remove the originals and the temp file
    rm `cat /tmp/tempfile.$$`
    rm /tmp/tempfile.$$
}

I tried tar, cpio, pax and GNU tar, and I always get the same error:
tar: List of arguments is too large
pax: List of arguments is too large

I also tried with gzip, without compression, and without split...

When the total size of the files to compress is below 2GB, or there are fewer than about 2000 files, it works perfectly.

Any clues on that?
# 2  
Old 04-03-2009
Try it with xargs:

Code:
find ... | xargs tar ....
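
For what it's worth, the message is not a 2GB limit: the backticks expand the whole file list onto one command line, which exceeds the kernel's argument-length limit (see getconf ARG_MAX), while xargs reads the names from stdin and runs tar in batches that fit. One caveat: if xargs needs more than one batch, tar -c would overwrite the archive on each run, so it is safer to append with -r to an uncompressed tar and compress it afterwards. A minimal sketch, assuming GNU tar and the same $ORIG, $DEST and $TARFILE variables as the function above:

Code:
# xargs keeps every tar invocation under ARG_MAX; -r appends, so a second or
# third batch does not clobber the archive. GNU tar cannot append to a
# compressed archive, hence the separate bzip2 step at the end.
find "$ORIG" -type f -mtime +30 -print | xargs tar -rvf "$DEST/$TARFILE.tar"
bzip2 "$DEST/$TARFILE.tar"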

Regards
# 3  
Old 04-03-2009
It worked!

Now I archive the files and remove the originals like this:
Code:
# tar's -v listing goes to stdout here, so pipe the archived names to xargs rm to delete the originals
find $ORIG -type f -mtime +30 -print | xargs tar cvjf $DEST/$TARFILE | xargs rm
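
A couple of caveats on the one-liner above: if xargs splits the file list across more than one tar invocation, each new "tar c" run overwrites the archive produced by the previous one, and the trailing xargs rm deletes originals as soon as tar lists them, even if something fails later. A more defensive sketch, assuming GNU tar and bash plus the same $ORIG, $DEST, $TARFILE and 40MB split size from the original function (and file names without spaces or newlines):

Code:
#!/bin/bash
set -o pipefail                  # fail the pipeline if tar itself fails, not just split

LIST=/tmp/tempfile.$$            # temp file list, as in the original function
find "$ORIG" -type f -mtime +30 -print > "$LIST"

# -T makes GNU tar read the file names from the list, so tar runs exactly once
# and the command line never grows; split keeps the 40MB chunking from before.
if tar -cvjf - -T "$LIST" | split -b 40m -d - "$DEST/$TARFILE"; then
    xargs rm < "$LIST"           # remove the originals only if archiving succeeded
fi
rm -f "$LIST"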

Thanks!

10 More Discussions You Might Find Interesting

1. AIX

Tar files larger than 2GB

Hi, Does anyone know if it is possible to tar files larger than 2GB? The reason being is they want me to dump a single file (which is around 20GB) to a tape drive and they will restore it on a Solaris box. I know the tar have a limitation of 2GB so I am thinking of a way how to overcome this.... (11 Replies)
Discussion started by: depam

2. UNIX for Dummies Questions & Answers

Issue: Compress in unix server and FTP to windows and open the compress file using Winzip

Hi All ! We have to compress a big data file in unix server and transfer it to windows and uncompress it using winzip in windows. I have used the utility ZIP like the below. zip -e <newfilename> df2_test_extract.dat but when I compress files greater than 4 gb using zip utility, it... (4 Replies)
Discussion started by: sakthifire

3. Shell Programming and Scripting

compress files

Could someone give me an idea how to compress all files from a given directory that are not of type .z (compressed). Please help. (2 Replies)
Discussion started by: lesstjm

4. Filesystems, Disks and Memory

Compress files on NAS

Hello, I am having difficulty compressing the files using compress or GZIP utility on NAS share NFS mounted on my linux server. Any one have idea on how to do this ? I get the followign error but the trying to compress the files STRP2> compress STR_OUTBOUND_CDM_LOG_LOB.PRT2008_26.txt... (0 Replies)
Discussion started by: kamathg

5. UNIX for Dummies Questions & Answers

how to compress html files

Hello, On a Centos 5.0 server, Apache 2.2 delivers static html page. How could I compress those html pages to gain speed and save bandwidth? is there a utility that would be effective and save? Thanks (2 Replies)
Discussion started by: JCR

6. UNIX for Dummies Questions & Answers

Compress files

Hi All, I would like to archive some of the scripts below(USFINUM042006_01.CSV USFINUM042006_02.CSV and USFINUM042006_03.CSV )and also use a wildcard e.g. <command> USFINUM*.CSV. Also there are a lot of similar files but I want only the three latest files to be compressed. Which is the best... (3 Replies)
Discussion started by: indira

7. UNIX for Advanced & Expert Users

Problem creating files greater than 2GB

With the C code I am able to create files greater than 2GB if I use the 64 bit compile option -D_FILE_OFFSET_BITS=64. There I am using the function fprintf to write into the file. But when I use C++ and ofstream the file is getting truncated when the size grows beyond 2GB. Is there any special... (1 Reply)
Discussion started by: bobbyjohnz

8. Shell Programming and Scripting

cpio - files > 2gb

Hi, Currently a backup script copies compressed files to tape using cpio command (on AIX 5.2). Recently we've had a compressed file which has gone over 2 GB in size resulting in an error while copying this file onto the tape using cpio. Any suggestions on relevant workarounds would be much... (2 Replies)
Discussion started by: dnicky

9. UNIX for Dummies Questions & Answers

How to Compress log files?

Hi, I have my log files in /home/user1/temp2/logs i want to archive *.log and *.txt files and to store in my /home/user1/temp2/archved/ with *.log with Time stamp ,Please let me know how to do this? (1 Reply)
Discussion started by: redlotus72

10. Filesystems, Disks and Memory

Use of unzip with content files > 2Gb

I am zipping and downloading zip files from an AS400 using the unzip utility. The files are being downloaded onto a Solaris box. Some of the content files in the zip are larger than 2GB. When using the unzip utility (version 5.32), it complains of 'disk full'. The disk is not full, I still have... (2 Replies)
Discussion started by: tcarlson