Full Discussion: Tar files larger than 2GB
Operating Systems > AIX > Tar files larger than 2GB
Post 302453250 by 116@434 on Tuesday, 14 September 2010, at 02:15 PM
A few versions of tar have a -z / -Z option to create a gzipped/compressed tar archive directly. You can try that out; it might work.
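For instance, a rough sketch of both approaches, assuming GNU tar is available as gtar (e.g. from the AIX Toolbox) and using placeholder paths and archive names:

# GNU tar: -z gzips the archive as it is written (-Z runs it through compress)
gtar -czf backup.tar.gz /path/to/data

# Native tar without -z: pipe the archive through gzip instead
tar -cf - /path/to/data | gzip -c > backup.tar.gz

# If no single output file may exceed 2GB, split the compressed stream into pieces
tar -cf - /path/to/data | gzip -c | split -b 1024m - backup.tar.gz.part_

# Restore by concatenating the pieces back into one stream
cat backup.tar.gz.part_* | gzip -dc | tar -xvf -

Compression only helps if the compressed archive itself stays under the limit; the split variant sidesteps the per-file size cap regardless of how well the data compresses.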
 

10 More Discussions You Might Find Interesting

1. Filesystems, Disks and Memory

Use of unzip with content files > 2Gb

I am zipping and downloading zip files from an AS400 using the unzip utility. The files are being downloaded onto a Solaris box. Some of the content files in the zip are larger than 2GB. When using the unzip utility (version 5.32), it complains of 'disk full'. The disk is not full, I still have... (2 Replies)
Discussion started by: tcarlson

2. UNIX for Advanced & Expert Users

sending larger files via ftp

hi all, i am looking for ways to make ftp efficient by tuning the parameters. currently, tcp_max_buf is 1 MB and tcp_xmit_hiwat is 48 KB. say, to transmit multiple 2 gb files from a unix server to a mainframe sys, will increasing the window size or the send buffer size of the current TCP/IP... (6 Replies)
Discussion started by: matrixmadhan

3. Shell Programming and Scripting

cpio - files > 2gb

Hi, Currently a backup script copies compressed files to tape using cpio command (on AIX 5.2). Recently we've had a compressed file which has gone over 2 GB in size resulting in an error while copying this file onto the tape using cpio. Any suggestions on relevant workarounds would be much... (2 Replies)
Discussion started by: dnicky

4. Filesystems, Disks and Memory

tar 2GB limit

Any idea how to get around this limit? I have a 42GB database backup file (.dmp) taking up disk space because neither tar nor cpio is able to put it onto a tape. I am on a Sun Solaris box running SunOS 5.8. I would appreciate whatever help can be provided. Thanks! (9 Replies)
Discussion started by: SLKRR

5. UNIX for Advanced & Expert Users

Problem creating files greater than 2GB

With the C code I am able to create files greater than 2GB if I use the 64-bit compile option -D_FILE_OFFSET_BITS=64; there I am using the function fprintf to write into the file. But when I use C++ and ofstream, the file is getting truncated when the size grows beyond 2GB. Is there any special... (1 Reply) (a compile-flag sketch for this follows after this list)
Discussion started by: bobbyjohnz

6. Linux

Compress files >2GB

Hi folks, I'm trying to compress a certain number of files from a cifs mount to an xfs mount, but cannot do it when the total size of the files is bigger than 2GB. Is there any limitation above 2GB? The OS is SLES 64-bit. The files are 1MB at most, so there are approx. 2000 files to compress... (2 Replies)
Discussion started by: xavix

7. Shell Programming and Scripting

tar command to explore multiple layers of tar and tar.gz files

Hi all, I have a tar file and inside that tar file is a folder with additional tar.gz files. What I want to do is look inside the first tar file and then find the second tar file I'm looking for, look inside that tar.gz file to find a certain directory. I'm encountering issues by trying to... (1 Reply)
Discussion started by: bashnewbee

8. UNIX for Dummies Questions & Answers

Using UNIX Commands with Larger number of Files

Hello Unix Gurus, I am new to Unix so need some help on this. I am using the following commands: 1) mv -f Inputpath/*. outputpath 2) cp Inputpath/*. outputpath 3) rm -rf somepath/* 4) Find Inputpath/*. Now I get the following error with... (18 Replies)
Discussion started by: pchegoor

9. Shell Programming and Scripting

Backing up larger files with TAR command

I need to backup my database but the files are very large and the TAR command will not let me. I searched aids and found that I could do something with the mknod, COMPRESS and TAR command using them together. I appreciate your help. (10 Replies)
Discussion started by: frizcala

10. UNIX for Beginners Questions & Answers

Need to select files larger than 500Mb from servers

I need help modifying these two scripts to do the following: - print files in (MB) instead of (KB) - only select files larger than 500MB -> these will be mailed out daily - Select all files regardless of size all in (MB) -> these will be mailed out once a week this is what i have so far and... (5 Replies)
Discussion started by: donpasscal
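On the large-file build question in discussion 5 above, a minimal compile-flag sketch (the source file names and compiler invocations are placeholders, and xlC stands in for whatever C++ compiler is in use): the POSIX getconf LFS_* variables expand to the platform's large-file flags, which on most systems boil down to -D_FILE_OFFSET_BITS=64 so that off_t is 64-bit and stdio can write past 2GB; whether ofstream follows suit still depends on the C++ standard library being large-file aware.

# Ask the platform for its large-file compile and link flags
getconf LFS_CFLAGS     # typically prints -D_FILE_OFFSET_BITS=64 (sometimes more)
getconf LFS_LDFLAGS
getconf LFS_LIBS

# Build with those flags so off_t is 64-bit throughout the program
cc  $(getconf LFS_CFLAGS) -o writer   writer.c   $(getconf LFS_LDFLAGS) $(getconf LFS_LIBS)
xlC $(getconf LFS_CFLAGS) -o writerpp writer.cpp $(getconf LFS_LDFLAGS) $(getconf LFS_LIBS)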
CURLOPT_MAXFILESIZE(3)                    curl_easy_setopt options                   CURLOPT_MAXFILESIZE(3)

NAME
       CURLOPT_MAXFILESIZE - maximum file size allowed to download

SYNOPSIS
       #include <curl/curl.h>

       CURLcode curl_easy_setopt(CURL *handle, CURLOPT_MAXFILESIZE, long size);

DESCRIPTION
       Pass a long as parameter. This allows you to specify the maximum size (in bytes) of a file to
       download. If the file requested is found larger than this value, the transfer will not start and
       CURLE_FILESIZE_EXCEEDED will be returned.

       The file size is not always known prior to download, and for such files this option has no effect
       even if the file transfer ends up being larger than this given limit. This concerns both FTP and
       HTTP transfers.

       If you want a limit above 2GB, use CURLOPT_MAXFILESIZE_LARGE(3).

DEFAULT
       None

PROTOCOLS
       FTP and HTTP

EXAMPLE
       TODO

AVAILABILITY
       Always

RETURN VALUE
       Returns CURLE_OK

SEE ALSO
       CURLOPT_MAXFILESIZE_LARGE(3), CURLOPT_MAX_RECV_SPEED_LARGE(3)

libcurl 7.54.0                              February 03, 2016                        CURLOPT_MAXFILESIZE(3)
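The EXAMPLE section above is marked TODO in this version of the man page. As an illustration only, the curl command-line tool exposes the same limit through its --max-filesize option (the URL below is a placeholder); when the server reports a size above the limit, the transfer is refused and curl exits non-zero (code 63, CURLE_FILESIZE_EXCEEDED):

# Refuse to download anything reported as larger than 500 MB (value is in bytes)
curl --max-filesize 524288000 -O http://example.com/big-archive.tar.gz

# When the limit itself must exceed 2GB, the library side offers CURLOPT_MAXFILESIZE_LARGE(3).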