Full Discussion: Compress files >2GB
Post 302303776 by xavix on Friday 3rd of April 2009 12:06:37 PM
It worked!

Now I move the files like this:
Code:
# archive files older than 30 days into $DEST/$TARFILE, then remove the originals (assumes GNU find/tar)
find "$ORIG" -type f -mtime +30 -print0 | tar --null -T - -cvjf "$DEST/$TARFILE" --remove-files

Thanks!
 

10 More Discussions You Might Find Interesting

1. Filesystems, Disks and Memory

Use of unzip with content files > 2Gb

I am zipping and downloading zip files from an AS400 using the unzip utility. The files are being downloaded onto a Solaris box. Some of the content files in the zip are larger than 2GB. When using the unzip utility (version 5.32), it complains of 'disk full'. The disk is not full; I still have... (2 Replies)
Discussion started by: tcarlson
2 Replies
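
For the unzip question above, the 'disk full' message from unzip 5.x is usually its large-file limit rather than a genuinely full disk. A minimal sketch, assuming a newer Info-ZIP unzip (6.0+) is available; big.zip and the member path are placeholders:
Code:
# check whether this unzip build reports large-file / Zip64 support
unzip -v | head -20
# stream one oversized member straight to a file instead of a normal extraction
unzip -p big.zip data/huge_member.dat > /export/stage/huge_member.dat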

2. UNIX for Dummies Questions & Answers

How to Compress log files?

Hi, I have my log files in /home/user1/temp2/logs. I want to archive the *.log and *.txt files and store them in /home/user1/temp2/archved/ with a timestamp. Please let me know how to do this? (1 Reply)
Discussion started by: redlotus72
1 Replies
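
For the log-archiving question above, a minimal sketch using the paths from the post; the logs_<timestamp>.tar.gz name is only an assumption:
Code:
# bundle *.log and *.txt into one timestamped archive under archved/
ts=$(date +%Y%m%d_%H%M%S)
cd /home/user1/temp2/logs || exit 1
tar czvf "/home/user1/temp2/archved/logs_${ts}.tar.gz" ./*.log ./*.txt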

3. Shell Programming and Scripting

cpio - files > 2gb

Hi, Currently a backup script copies compressed files to tape using the cpio command (on AIX 5.2). Recently we've had a compressed file grow over 2 GB in size, resulting in an error while copying this file onto the tape with cpio. Any suggestions on relevant workarounds would be much... (2 Replies)
Discussion started by: dnicky
2 Replies
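
For the cpio question above, one commonly suggested workaround is to split the oversized file into pieces below the 2 GB limit before writing and reassemble after restore. A sketch only; it assumes split accepts -b with an m suffix (check the AIX man page), and /dev/rmt0 and bigdump.Z are placeholders:
Code:
# split, write the pieces to tape with cpio, and rebuild the file after restore
split -b 1900m bigdump.Z bigdump.Z.part_
ls bigdump.Z.part_* | cpio -ov > /dev/rmt0
# after restoring the pieces:
cat bigdump.Z.part_* > bigdump.Z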

4. UNIX for Advanced & Expert Users

Problem creating files greater than 2GB

With C code I am able to create files greater than 2GB if I use the 64-bit compile option -D_FILE_OFFSET_BITS=64; there I use fprintf to write into the file. But when I use C++ and ofstream, the file gets truncated when the size grows beyond 2GB. Is there any special... (1 Reply)
Discussion started by: bobbyjohnz
1 Replies
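
For the ofstream question above, the same large-file macros generally have to reach the C++ translation units as well; whether that alone is enough depends on the compiler and C++ runtime (older libstdc++ builds needed large-file support in the library itself). A sketch, with writer.cpp as a placeholder:
Code:
# pass the large-file macros to the C++ build too
g++ -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -o writer writer.cpp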

5. UNIX for Dummies Questions & Answers

Compress files

Hi All, I would like to archive some of the files below (USFINUM042006_01.CSV, USFINUM042006_02.CSV and USFINUM042006_03.CSV) and also use a wildcard, e.g. <command> USFINUM*.CSV. There are a lot of similar files, but I want only the three latest files to be compressed. Which is the best... (3 Replies)
Discussion started by: indira
3 Replies
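
For the "three latest files" question above, a minimal sketch, assuming the filenames contain no spaces or newlines (ls -t sorts by modification time, newest first):
Code:
# compress only the three most recently modified USFINUM*.CSV files
ls -t USFINUM*.CSV | head -3 | xargs compress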

6. UNIX for Dummies Questions & Answers

how to compress html files

Hello, On a CentOS 5.0 server, Apache 2.2 delivers static HTML pages. How could I compress those HTML pages to gain speed and save bandwidth? Is there a utility that would be effective and safe? Thanks (2 Replies)
Discussion started by: JCR
2 Replies
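
For the static-HTML question above, Apache 2.2 can compress responses on the fly with mod_deflate. A sketch; it assumes the stock httpd.conf already loads mod_deflate (look for a LoadModule deflate_module line), and the conf.d file name is arbitrary:
Code:
# enable on-the-fly gzip compression for text responses, then restart Apache
cat > /etc/httpd/conf.d/deflate.conf <<'EOF'
AddOutputFilterByType DEFLATE text/html text/plain text/css text/xml application/x-javascript
EOF
service httpd restart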

7. Filesystems, Disks and Memory

Compress files on NAS

Hello, I am having difficulty compressing files with the compress or gzip utility on a NAS share NFS-mounted on my Linux server. Does anyone have an idea how to do this? I get the following error when trying to compress the files: STRP2> compress STR_OUTBOUND_CDM_LOG_LOB.PRT2008_26.txt... (0 Replies)
Discussion started by: kamathg
0 Replies
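
For the NAS question above, the error text is cut off, so the real fix depends on it (export permissions and root squashing are frequent culprits). One workaround sometimes tried is to compress on local disk and move the result back; a sketch using the filename from the post:
Code:
# write the compressed output to local scratch space, then move it back onto the NFS share
gzip -c STR_OUTBOUND_CDM_LOG_LOB.PRT2008_26.txt > /tmp/STR_OUTBOUND_CDM_LOG_LOB.PRT2008_26.txt.gz &&
  mv /tmp/STR_OUTBOUND_CDM_LOG_LOB.PRT2008_26.txt.gz .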

8. Shell Programming and Scripting

compress files

Could someone give me an idea of how to compress all files in a given directory that are not already of type .z (compressed)? Please help. (2 Replies)
Discussion started by: lesstjm
2 Replies
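
For the question above, a minimal sketch; /path/to/dir is a placeholder, and note that compress itself writes *.Z files:
Code:
# run compress on every regular file that is not already compressed
find /path/to/dir -type f ! -name '*.Z' ! -name '*.z' -exec compress {} \;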

9. UNIX for Dummies Questions & Answers

Issue: Compress on a Unix server, FTP to Windows, and open the compressed file using WinZip

Hi All! We have to compress a big data file on a Unix server, transfer it to Windows, and uncompress it using WinZip. I have used the zip utility like below: zip -e <newfilename> df2_test_extract.dat. But when I compress files greater than 4 GB with the zip utility, it... (4 Replies)
Discussion started by: sakthifire
4 Replies
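
For the zip/WinZip question above, the classic zip format caps members at 4 GB; Info-ZIP zip 3.0 can go past that with Zip64 or by writing a split archive, provided the WinZip version on the other side understands those formats. A sketch; the archive and file names follow the post:
Code:
# write an encrypted archive in 2 GB pieces (zip 3.0+); WinZip must support split/Zip64 archives
zip -s 2g -e newfilename.zip df2_test_extract.dat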

10. AIX

Tar files larger than 2GB

Hi, Does anyone know if it is possible to tar files larger than 2GB? The reason is they want me to dump a single file (around 20GB) to a tape drive, and they will restore it on a Solaris box. I know tar has a limitation of 2GB, so I am thinking of a way to overcome this.... (11 Replies)
Discussion started by: depam
11 Replies
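
For the 20GB-to-tape question above, GNU tar does not have the classic 2 GB member limit, so one route is to write with GNU tar on AIX and read it back with GNU tar on Solaris. A sketch; the gtar binary name and the tape device paths are assumptions for those platforms:
Code:
# on AIX: write straight to the tape device with GNU tar
gtar -cvf /dev/rmt0 /path/to/20gb_file
# on Solaris: read it back, again with GNU tar
gtar -xvf /dev/rmt/0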