largefile (>2gb) problem


 
# 1  
Old 01-17-2002
largefile (>2gb) problem

I'm working on SCO UNIX 7 and I'm trying to create files greater than 2GB.

I've done the following and still haven't solved the problem:
1) fsadm -F vxfs -o largefiles [mountpoint]
2) ulimit -f, -d, -s, -c unlimited
3) scoadmin > system tuning > Defaults manager > vxfs defaults: setting maxsize to approx. 12GB
4) scoadmin > system tuning > Process Limit Parameters: changed the values of SFSZLIM and HFSZLIM to 0x7FFFFFFF

I still get the error "file size limit exceeded" when I try to cat two 2GB files into one 4GB file.
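For reference, this is roughly what I'm doing (the mount point and file names below are just placeholders for my real ones):

# check whether the largefiles flag is really set on the file system
# (with no options, fsadm should report largefiles or nolargefiles)
fsadm -F vxfs /bigfs

# check the per-process file size limit in the current shell
ulimit -a

# try to build the >2GB file from the two ~2GB pieces
cat /bigfs/part1 /bigfs/part2 > /bigfs/combined   # fails: "file size limit exceeded"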

What the hell is the problem?
# 2  
Old 01-17-2002
It may have something to do with the following:
(Check your version of VxFS - found at http://docsrv.caldera.com/ )

VxFS 3.2 contains new features that are incompatible with earlier versions of some operating systems and with old applications. These features are large files (file sizes greater than 2 Gbyte), and hierarchical storage management via the DMAPI (Data Management Applications Programming Interface).

Large files are available only with the Version 4 disk layout, available in VxFS 3.2 and above, so an older operating system running a previous version of VxFS would never be exposed to them (the file system mount would fail). But many existing applications will break if confronted with large files, so a compatibility flag is provided that allows or prevents the creation of large files on the file system. If the largefile compatibility flag is set, files larger than 2 Gbytes may be created on the file system. If it is not set, any attempt to create a large file on the file system will fail.

An attempt to set the flag via the -o largefiles option will succeed only if the file system has the Version 4 disk layout (see the vxupgrade(1M) manual page to upgrade a file system from an earlier disk layout to the Version 4 disk layout). An attempt to clear the flag via the -o nolargefiles option will succeed only if the flag is set and there are no large files present on the file system (see mount_vxfs(1M)).
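A rough sequence for checking and enabling this might look like the following (the mount point is a placeholder, and you should confirm the exact syntax against the vxupgrade(1M) and fsadm(1M) man pages on your release):

# report the current VxFS disk layout version for the file system
vxupgrade /bigfs

# if it is older than Version 4, upgrade the layout (see vxupgrade(1M))
vxupgrade -n 4 /bigfs

# set the largefile compatibility flag
fsadm -F vxfs -o largefiles /bigfs

# verify: with no options, fsadm should report largefiles or nolargefiles
fsadm -F vxfs /bigfs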



NOTE: Changing the largefile compatibility flag may require changes to /etc/vfstab. For example, if fsadm is used to set the largefile compatibility flag, but nolargefiles is specified as a mount option in /etc/vfstab, the file system will not be mountable.
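For example, an /etc/vfstab entry along these lines (the device names are made up; check vfstab(4) for the exact column order on your system) keeps the mount options consistent with the on-disk flag:

# special             fsckdev               mountp  fstype  pass  automnt  mntopts
/dev/dsk/c0b0t0d0s4   /dev/rdsk/c0b0t0d0s4  /bigfs  vxfs    1     yes      largefiles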
The -o largefiles and -o nolargefiles options are the only fsadm options that can be used on an unmounted file system. An unmounted file system can be specified by invoking fsadm with a special device rather than a mount point. If an unmounted file system is specified, it must be clean.
thehoghunter
# 3  
Old 01-18-2002
What about removing the vxfs file system and using ufs instead? ufs also supports largefiles.

UNIX SVR4 already had largefile support. Is it then wrong to assume that UnixWare 7 would have a Veritas file system older than version 3.2? UnixWare 7 should support largefiles.

Is there a UNIX command that can be used to check the version of the installed Veritas file system?
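The closest thing I can think of is something like the following (the package name is only a guess on my part; the layout check uses vxupgrade as mentioned above):

# report the disk layout version of an existing VxFS file system
vxupgrade /bigfs

# look for the installed VxFS package and its version string
pkginfo | grep -i vxfs
pkginfo -l vxfs        # substitute whatever package name the previous line shows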
# 4  
Old 01-18-2002
I think I've managed to determine what the problem is:

- I used the "ulimit -f unlimited" command;
- when I type "ulimit -a", the output gives the file size as "unlimited"
- but when I use ulimit without parameters, it gives me an output of 419430, which equals 2GB!

What's the problem?
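One thing I still need to sort out (just a sketch; the -S/-H flags work in ksh and may differ elsewhere, and ulimit with no arguments usually reports the soft file size limit, often in 512-byte blocks rather than bytes):

ulimit -Sf              # soft file size limit - this is the one that actually stops the write
ulimit -Hf              # hard file size limit
ulimit -f unlimited     # only affects this shell and its children

If I understand it right, the login defaults come from the kernel tunables SFSZLIM/HFSZLIM, so changes made through scoadmin probably only take effect after the kernel is rebuilt and the machine rebooted.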
 