Full Discussion: File size exceeding 2GB
Post 4659 by mod on Monday 30th of July 2001 06:53:32 AM
If changing the filesystem to "largefiles" doesn't work, then try upgrading your gzip program to a newer version. In particular, the gzip that ships with HP-UX 10.20 doesn't support files >2.0GB. You can download a newer version (and a newer tar as well) from

http://hpux.cs.utah.edu/
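
For reference, a minimal sketch of the commands involved, assuming an HP-UX VxFS filesystem mounted at /data (a hypothetical mount point):

    # show whether the filesystem already allows files larger than 2GB
    fsadm -F vxfs /data

    # switch an existing VxFS filesystem over to large-file support
    fsadm -F vxfs -o largefiles /data

    # check which gzip build is installed before deciding to upgrade
    gzip --version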

 

10 More Discussions You Might Find Interesting

1. Programming

C++ Problem, managing >2Gb file

My C++ program returns a 'Disk Full' message when I try to manage a file larger than 2GB. The process is very simple: based on a TXT file, the process combines the information, generating another temporary file (which produces the error), to fill up a database. My FS, during the process, reaches 40%...... (4 Replies)
Discussion started by: ASOliveira

2. Solaris

SUN Solaris 9 - Is there a 2GB file size limit?

Hi, I am using SUN/Solaris 9 and I was told that some UNIX versions have a 2GB file size limit. Does this apply to SUN/Solaris 9? Thanks. (2 Replies)
Discussion started by: GMMike

3. Shell Programming and Scripting

efficiently split a 2GB text file into two

Can an expert kindly write an efficient Linux ksh script that will split a large 2 GB text file into two? Here are a couple of sample records from that text file: "field1","field2","field3",11,22,33,44 "TG","field2b","field3b",1,2,3,4 The above rows are delimited by commas. This script is to... (2 Replies) (a minimal ksh sketch appears after this list)
Discussion started by: ihot

4. UNIX for Dummies Questions & Answers

MAX file size limited to 2GB

Hi All, We are running an HP rp7400 box with HP-UX 11i v1. Recently, we changed 3 kernel parameters: a) msgseg from 32560 to 32767, b) msgmnb from 65536 to 65535, c) msgssz from 128 to 256. Then we noticed that every application debug file grows up to 2GB and then stops. So far we did not... (1 Reply)
Discussion started by: mhbd

5. AIX

Creating > 2GB file

I am trying to execute a database dump to a file, but can't seem to get around the 2GB file size limit. I have tried setting the user limit (ulimit) to -1, but no luck. (4 Replies)
Discussion started by: markper

6. Linux

unzipping file > 2gb

I am not able to unzip a file greater than 2GB. Any suggestions on how to do that in Linux? Regards, Manoj (5 Replies)
Discussion started by: manoj.solaris

7. UNIX for Advanced & Expert Users

How to create a file more than 2GB

Hi, I am executing a SQL query and the output is more than 2GB, hence the process is failing. How can I create a file larger than 2GB? Thanks, Risshanth (1 Reply)
Discussion started by: risshanth

8. HP-UX

2GB file size limit

Greetings, I'm attempting to dump a filesystem from a RHEL5 Linux server to a VXFS filesystem on an HP-UX server. The VXFS filesystem is large file enabled and I've confirmed that I can copy/scp a file >2GB to the filesystem. # fsadm -F vxfs /os_dumps largefiles # mkfs -F vxfs -m... (12 Replies)
Discussion started by: bkimura

9. UNIX for Dummies Questions & Answers

Delete the file which crossed 2GB

Hi, I want to create a bash script that deletes the specified file once it crosses 2GB, and I want to take a backup before doing that. Please help me with how to do the same; I use an RHEL5 server. (22 Replies) (a minimal bash sketch appears after this list)
Discussion started by: Rahulne25

10. Fedora

/var/log/btmp size 2.2Gb daily

Hello, one Fedora server has an issue where /var/log/btmp grows to 2.2GB or more every day. I need your help to determine the cause and isolate it. Thank you! (6 Replies)
Discussion started by: feroccimx
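
For the file-split question in discussion 3 above, here is a minimal ksh sketch using the standard wc and split utilities. The input name bigfile.txt and the output prefix half_ are hypothetical placeholders; splitting on a line boundary keeps records intact:

    #!/bin/ksh
    # count the lines, then split the file into two roughly equal halves
    lines=$(wc -l < bigfile.txt)
    half=$(( (lines + 1) / 2 ))
    split -l "$half" bigfile.txt half_
    # result: half_aa holds the first half, half_ab the second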
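
And for the backup-then-delete question in discussion 9, a minimal bash sketch. The log path, the backup directory, and the 2GB threshold expressed in bytes are all assumptions to adapt:

    #!/bin/bash
    # back up and then remove the file, but only once it has crossed 2GB
    file=/var/log/app.log          # hypothetical path
    backup_dir=/backup             # hypothetical backup location
    limit=2147483648               # 2GB in bytes

    size=$(stat -c %s "$file")
    if [ "$size" -gt "$limit" ]; then
        cp -p "$file" "$backup_dir/$(basename "$file").$(date +%Y%m%d%H%M%S)"
        rm -f "$file"
    fi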
LWP-DOWNLOAD(1) 					User Contributed Perl Documentation					   LWP-DOWNLOAD(1)

NAME
       lwp-download - Fetch large files from the web

SYNOPSIS
       lwp-download [-a] [-s] <url> [<local path>]

DESCRIPTION
       The lwp-download program will save the file at url to a local file. If local path is not specified, then the current directory is assumed. If local path is a directory, then the last segment of the path of the url is appended to form a local filename. If the url path ends with a slash, the name "index" is used.

       With the -s option, the last segment of the filename is picked up from server-provided sources like the Content-Disposition header or any redirect URLs. A file extension to match the server-reported Content-Type might also be appended. If a file with the produced filename already exists, then lwp-download will prompt before it overwrites and will fail if its standard input is not a terminal. This form of invocation will also fail if no acceptable filename can be derived from the sources mentioned above.

       If local path is not a directory, then it is simply used as the path to save into. If the file already exists it is overwritten.

       The lwp-download program is implemented using the libwww-perl library. It is better suited to download big files than the lwp-request program because it does not store the file in memory. Another benefit is that it will keep you updated about its progress and that you don't have many options to worry about.

       Use the "-a" option to save the file in text (ascii) mode. Might make a difference on dosish systems.

EXAMPLE
       Fetch the newest and greatest perl version:

           $ lwp-download http://www.perl.com/CPAN/src/latest.tar.gz
           Saving to 'latest.tar.gz'...
           11.4 MB received in 8 seconds (1.43 MB/sec)

AUTHOR
       Gisle Aas <gisle@aas.no>

perl v5.12.1                            2010-07-05                            LWP-DOWNLOAD(1)
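
As a short aside on the -s option described above, a hedged usage sketch; the URL is a hypothetical example where the server, not the URL path, supplies a sensible filename:

    $ lwp-download -s http://www.example.com/download?id=12345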