Full Discussion: largefile (>2gb) problem
Post 13492 by roydv on Friday 18th of January 2002 12:20:12 AM
What about removing the vxfs file system and using ufs instead? ufs also supports largefiles.

UNIX SVR4 already had largefile support; is it then wrong to assume that UnixWare 7 would ship with a Veritas file system older than version 3.2? UnixWare 7 should support largefiles.

Is there a UNIX command that can be used to check the version of the installed Veritas file system?
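
A few commands that might reveal which VxFS release is installed, assuming the SVR4 packaging tools are present; the package name VRTSvxfs and the device slice are only guesses and may differ on UnixWare 7:

Code:
# Find the name of the installed VxFS package (if any).
pkginfo | grep -i vxfs

# Show detailed information, including the version, assuming the package
# really is named VRTSvxfs (substitute whatever the line above reports).
pkginfo -l VRTSvxfs

# fstyp reports the filesystem type of a slice; in verbose mode (where supported)
# a vxfs slice also shows its disk layout version.  The slice name is only an example.
fstyp -v /dev/rdsk/c0b0t0d0s1

If that reports 3.2 or later, it may just be a matter of creating or mounting the filesystem with the largefiles option.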
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

File size exceeding 2GB

I am working on HP-UX. I have a 600 MB file in compressed form. During decompression, when the file size reaches 2GB, decompression aborts. What should be done? (3 Replies)
Discussion started by: Nadeem Mistry

2. Programming

C++ Problem, managing >2Gb file

My C++ program returns a 'Disk Full' message when I try to manage a file larger than 2GB. The process is very simple: based on a TXT file, the process combines the information, generating another temporary file (which triggers the error) to fill up a database. My FS, during the process, reaches 40%... (4 Replies)
Discussion started by: ASOliveira

3. Shell Programming and Scripting

cpio - files > 2gb

Hi, currently a backup script copies compressed files to tape using the cpio command (on AIX 5.2). Recently we've had a compressed file go over 2 GB in size, resulting in an error while copying this file onto the tape using cpio. Any suggestions on relevant workarounds would be much... (2 Replies)
Discussion started by: dnicky

4. Filesystems, Disks and Memory

tar 2GB limit

Any idea how to get around this limit? I have a 42GB database backup file (.dmp) taking up disk space because neither tar nor cpio is able to put it onto a tape. I am on Sun Solaris running SunOS 5.8. I would appreciate whatever help can be provided. Thanks! (9 Replies) (A generic split-and-reassemble workaround is sketched after this list.)
Discussion started by: SLKRR

5. UNIX for Advanced & Expert Users

Problem creating files greater than 2GB

With C code I am able to create files greater than 2GB if I use the 64-bit compile option -D_FILE_OFFSET_BITS=64; there I use fprintf to write into the file. But when I use C++ and ofstream, the file gets truncated when the size grows beyond 2GB. Is there any special... (1 Reply) (A compile-flag sketch for the large file environment appears after this list.)
Discussion started by: bobbyjohnz

6. AIX

Creating > 2GB file

I am trying to execute a database dump to a file, but can't seem to get around the 2GB file size limit. I have tried setting the user limit to -1, but no luck. (4 Replies)
Discussion started by: markper

7. Linux

Compress files >2GB

Hi folks, I'm trying to compress a certain number of files from a cifs mount to an xfs mount, but cannot do it when the total size of the files is bigger than 2GB. Is there any limitation above 2GB? The OS is 64-bit SLES. The files are at most 1MB each, so there are approx. 2000 files to compress... (2 Replies)
Discussion started by: xavix

8. Linux

unzipping file > 2gb

I am not able to unzip a file greater than 2GB. Any suggestions on how to do that in Linux? Regards, Manoj (5 Replies)
Discussion started by: manoj.solaris

9. UNIX for Advanced & Expert Users

How to create a file more than 2GB

Hi, I am executing a SQL query and the output is more than 2GB, hence the process is failing. How can I create a file larger than 2GB? Thanks, Risshanth (1 Reply)
Discussion started by: risshanth

10. AIX

Tar files larger than 2GB

Hi, does anyone know if it is possible to tar files larger than 2GB? The reason is they want me to dump a single file (which is around 20GB) to a tape drive and they will restore it on a Solaris box. I know tar has a 2GB limitation, so I am thinking of a way to overcome this... (11 Replies) (See the split-and-reassemble sketch after this list.)
Discussion started by: depam
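
For the C and C++ threads above (2 and 5), the usual route is to build in the large file compilation environment rather than change the code. A minimal sketch, assuming a POSIX getconf is available; bigwrite.c is a placeholder file name, and whether this helps ofstream depends on how largefile-safe the C++ library is:

Code:
# Ask the platform which flags select 64-bit file offsets.
getconf LFS_CFLAGS       # often prints -D_FILE_OFFSET_BITS=64 (among others)
getconf LFS_LDFLAGS
getconf LFS_LIBS

# Build with those flags so fopen/fprintf (and, where the library cooperates,
# ofstream) use 64-bit offsets.  bigwrite.c is just an example source file;
# use the corresponding C++ driver for C++ sources.
cc $(getconf LFS_CFLAGS) bigwrite.c $(getconf LFS_LDFLAGS) $(getconf LFS_LIBS) -o bigwrite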
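
Several of the tape and dump threads above (3, 4, 6 and 10) come down to either a per-user file size limit or an archiver that cannot store a member larger than 2GB. A rough, non-authoritative sketch of the generic workarounds; backup.dmp and /dev/rmt/0 are placeholders:

Code:
# Check the current limits and, where the administrator permits, lift the file size limit.
ulimit -a
ulimit -f unlimited

# If the archiver cannot store a >2GB member, split the file into pieces it can
# handle, archive the pieces, and reassemble them after the restore.
split -b 1024m backup.dmp backup.dmp.part.
tar cvf /dev/rmt/0 backup.dmp.part.*

# On the target system, after extracting the pieces from tape:
cat backup.dmp.part.* > backup.dmp

Keeping each piece under 2GB keeps every file inside whatever 32-bit limit the archiver or filesystem is imposing.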