10 More Discussions You Might Find Interesting
Does anyone know if it is possible to tar files larger than 2GB? The reason is that they want me to dump a single file (around 20GB) to a tape drive, and they will restore it on a Solaris box. I know tar has a 2GB limitation, so I am looking for a way to overcome this.... (11 Replies)
Discussion started by: depam
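One common workaround, assuming a standard `split` utility is available, is to break the large file into pieces below the 2GB limit before archiving, then reassemble them with `cat` after restore. This is a small-scale sketch; the file names are illustrative, and for a real 20GB file you would use something like `-b 1024m`.

```shell
# Small-scale demo of the split-then-archive approach.
printf 'pretend this is a 20GB database dump' > dump.dat

# Split into pieces below the 2GB limit (8-byte pieces for the demo;
# use e.g. "-b 1024m" for real data).
split -b 8 dump.dat dump.part.

# The pieces can then be written to tape individually, e.g.:
#   tar cvf /dev/rmt/0 dump.part.*
# On the receiving Solaris box, reassemble after extraction:
cat dump.part.* > restored.dat
cmp dump.dat restored.dat && echo "round-trip OK"
```

Because `split` names the pieces in lexical order (`dump.part.aa`, `dump.part.ab`, ...), the shell glob in the `cat` step restores them in the right sequence.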
2. UNIX for Advanced & Expert Users
I am executing an SQL query whose output is more than 2GB, so the process is failing. How can I create a file larger than 2GB?
Risshanth (1 Reply)
Discussion started by: risshanth
I am not able to unzip a file greater than 2GB.
Any suggestions on how to do that in Linux?
Manoj (5 Replies)
Discussion started by: manoj.solaris
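If the installed `unzip` build lacks large-file/Zip64 support, one workaround (assuming Python 3 is available) is to extract with Python's `zipfile` module, which handles Zip64. A small-scale sketch; `big.zip`, `data.txt`, and `outdir` are illustrative names, and the same commands apply to archives with entries larger than 2GB.

```shell
# Create a demo archive with the zipfile module (Zip64 is enabled
# automatically when an archive needs it).
printf 'payload' > data.txt
python3 -c "import zipfile; zipfile.ZipFile('big.zip', 'w').write('data.txt')"

# Extract with the zipfile module instead of the system unzip:
mkdir -p outdir
python3 -m zipfile -e big.zip outdir
cat outdir/data.txt
```

Alternatively, Info-ZIP `unzip` 6.0 and later support Zip64, so upgrading the system `unzip` may also resolve it.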
I'm trying to compress a certain number of files from a CIFS mount to an XFS mount, but cannot do it when the total size of the files is bigger than 2GB.
Is there any limitation above 2GB?
The OS is 64-bit SLES.
The files are at most 1MB each, so there are approx. 2000 files to compress... (2 Replies)
Discussion started by: xavix
I am trying to execute a database dump to a file, but can't seem to get past the 2GB file size limit. I have tried setting the user file size limit to -1 (unlimited), but no luck. (4 Replies)
Discussion started by: markper
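Since the thread mentions the user limit, one quick check (a sketch, assuming a POSIX shell) is to confirm the per-process file size limit actually took effect in the shell that launches the dump; if it already reads `unlimited`, the 2GB ceiling is coming from somewhere else, such as the filesystem or the dump tool itself.

```shell
# Show the current per-process file size limit for this shell.
ulimit -f

# Raise it for the current shell before running the dump, then verify:
ulimit -f unlimited
ulimit -f
```

Note that `ulimit` only affects the current shell and its children, so it must be set in the same session (or script) that runs the dump.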
6. UNIX for Advanced & Expert Users
With C code I am able to create files greater than 2GB if I use the 64-bit compile option -D_FILE_OFFSET_BITS=64; there I use fprintf to write to the file. But when I use C++ and ofstream, the file gets truncated when its size grows beyond 2GB. Is there any special... (1 Reply)
Discussion started by: bobbyjohnz
7. Filesystems, Disks and Memory
Any idea how to get around this limit? I have a 42GB database backup file (.dmp) taking up disk space because neither tar nor cpio is able to put it onto tape. I am on Sun Solaris running SunOS 5.8. I would appreciate whatever help can be provided. Thanks! (9 Replies)
Discussion started by: SLKRR
8. Shell Programming and Scripting
Currently a backup script copies compressed files to tape using the cpio command (on AIX 5.2). Recently we've had a compressed file grow over 2GB in size, resulting in an error while copying it onto the tape using cpio.
Any suggestions on relevant workarounds would be much... (2 Replies)
Discussion started by: dnicky
My C++ program returns a 'Disk Full' message when I try to handle a file larger than 2GB. The process is very simple: based on a TXT file, it combines the information into another temporary file (which triggers the error) to fill up a database.
During the process, my FS only reaches 40%...... (4 Replies)
Discussion started by: ASOliveira
10. UNIX for Dummies Questions & Answers
I am working on HP-UX.
I have a 600MB file in compressed form.
During decompression, when the file size reaches 2GB, decompression aborts.
What should be done? (3 Replies)
Discussion started by: Nadeem Mistry
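When the abort comes from writing the decompressed file rather than reading the compressed one, one workaround is to avoid materializing the full >2GB file at all and stream the decompressed data straight into its consumer through a pipe. A small-scale sketch using gzip (the names are illustrative); on HP-UX the same idea applies with `zcat file.Z | consumer` for `compress`-format files.

```shell
# Create a small compressed demo file.
printf 'demo payload' | gzip -c > big.gz

# Decompress to stdout and feed the consumer directly, so the
# decompressed data never has to exist as a single on-disk file
# ("wc -c" stands in for the real consumer here):
gzip -dc big.gz | wc -c
```

This only helps if the downstream step can read from a pipe; if an on-disk copy is genuinely required, the filesystem itself must support large files.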