How to tar a large number of files?


 
# 1  
Old 11-18-2010
How to tar a large number of files?

Hello

I have the following files:
VOICE_hhhh
SUBSCR_llll
DEL_kkkk

Consider that there are 1000 VOICE files + 1000 SUBSCR files + 1000 DEL files.

When I try to tar these files using
Code:
tar -cvf backup.tar VOICE* SUBSCR* DEL*

I get the error:
Code:
ksh: /usr/bin/tar: arg list too long

How can I overcome this?

thank you
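
For reference, the limit being hit here is the kernel's cap on the combined size of the argument list and environment passed to a single command; once the shell expands VOICE* SUBSCR* DEL* into thousands of file names, the exec of tar exceeds it and fails with "arg list too long". A quick way to see the limit, assuming a POSIX getconf is available:
Code:
# Maximum number of bytes of arguments + environment a single
# exec'ed command (such as tar) may receive on this system.
getconf ARG_MAX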
# 2  
Old 11-18-2010
Maybe this:
Code:
find . -name VOICE* -o -name SUBSCR* -o -name DEL* | xargs -t tar -cvf backup.tar


# 3  
Old 11-18-2010
Assuming HP-UX tar allows the -u option:
Code:
tar -cvf backup.tar VOICE* 
tar -uvf backup.tar SUBSCR* 
tar -uvf backup.tar DEL*
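
The same create-then-append idea can be written as a loop when there are more prefixes; a sketch with a hypothetical prefix list, still assuming each individual glob fits within the argument-length limit:
Code:
# The first group creates the archive, every later group is
# appended, so no single tar command line carries all the files.
first=yes
for p in VOICE SUBSCR DEL; do
    if [ "$first" = yes ]; then
        tar -cvf backup.tar ${p}*
        first=no
    else
        tar -uvf backup.tar ${p}*
    fi
done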

# 4  
Old 11-18-2010
Hi cabrao,

I used the following code:
Code:
find . -name VOICE* -o -name SUBSCR* -o -name DEL* -o -name EVENTS* -o -name CREDIT* -o -name RTCHANGE* -o -name GPRS* -o -name ACTIV* -o -name EXPR* | xargs -t tar -cvf backup.tar

But I got the following error:
Code:
find: missing conjunction
tar -cvf backup.tar

Regards

# 5  
Old 11-18-2010
Try quoting the patterns:
Code:
find . -name "VOICE*" -o -name "SUBSCR*" -o -name "DEL*" -o -name "EVENTS*" -o -name "CREDIT*" -o -name "RTCHANGE*" -o -name "GPRS*" -o -name "ACTIV*" -o -name "EXPR*" | xargs -t tar -cvf backup.tar

# 6  
Old 11-18-2010
What cabrao said: quoting the patterns keeps the shell from expanding the wildcards into file names before find ever sees them, which is what produced the "missing conjunction" error. Single quotes (') are the safest habit; double quotes (") also suppress filename expansion here, they just still allow parameter and command substitution.
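
A quick way to see the difference is to let echo show what the shell hands over:
Code:
# Unquoted: the shell expands the glob, so find receives hundreds
# of file names where it expects a single pattern.
echo VOICE*

# Quoted: the pattern is passed through untouched; filename
# expansion is suppressed inside both single and double quotes.
echo 'VOICE*'
echo "VOICE*"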
# 7  
Old 11-18-2010
Quote:
Originally Posted by cabrao
Code:
...| xargs -t tar -cvf backup.tar

I don't think this will work: with a long file list xargs will run tar more than once, and each invocation of tar -c overwrites the backup.tar created by the previous one.

If your tar supports the -I flag you can try:

Code:
find . -name 'VOICE*' -o -name ... >filelist
tar cv -I filelist -f backup.tar
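
With GNU tar rather than a vendor tar, the same file-list technique uses -T (--files-from); a sketch assuming GNU tar is installed:
Code:
# Build the member list with find (patterns quoted), then let tar
# read the names from the file instead of the command line.
find . -name 'VOICE*' -o -name 'SUBSCR*' -o -name 'DEL*' > filelist
tar -cvf backup.tar -T filelist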
