Moving extremely large number of files to destination


 
# 8  
Old 03-23-2009
Thanks, Methyl.

It was good to learn more about uname and the shell - I only knew about uname -a.

Coming back to the trace: all the files are under the same directory, on the same mount point, and none of the file names contain special characters like *, &, etc.

Your analysis is correct for both the size and the numbers, so let's continue with that.

I want to compress them all. -exec will certainly not work at this scale.
Code:
find ./ -type f -mtime +1176 | gzip -c > temp.gz

The above is not sufficient: it does create temp.gz, but it compresses find's output - the list of file names - rather than the files themselves, so it will not do what I want no matter how many files there are.

The code below does not solve the problem either:

Code:
find ./ -type f -mtime +1176 -print | xargs -n1 -i tar -cvf {}


Creating a temp file that contains the names of all the files and then reading it back to compress them looks like the long way round to me. Can't we do it with a single-line command?

A very similar question is posted in another thread of mine -
https://www.unix.com/unix-dummies-que...es-one-go.html

It would be good if you could migrate it - sorry for the confusion. The tasks are different but related, so..

(You may reply in the new thread or here as well - I am just concerned about the solution!)

Thanks,
Kedar
# 9  
Old 03-23-2009
Code:
find . -type f -mtime +1176 | 
  xargs tar cf - | gzip > all_"$(date +%F)".tgz

On a GNU system:

Code:
find . -type f -mtime +1176 -print0 | 
  xargs -0 tar cf - | bzip2 > all_"$(date +%F)".tbz2
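An editor's note on the pipelines above: when the file list is long, xargs will invoke tar more than once, so the output is several tar archives concatenated into one stream, and a plain tar -t stops reading after the first one. Two GNU-tar remedies, as a sketch (the archive name is a placeholder; --null/--files-from/--ignore-zeros are GNU extensions):

```shell
# Remedy 1: avoid the problem - let tar read the null-delimited
# file list itself, so there is exactly one tar invocation and
# one clean archive stream (GNU find and GNU tar).
find . -type f -mtime +1176 -print0 |
  tar --null --files-from=- -cf - |
  gzip > all_archive.tgz

# Remedy 2: if the archive was already made via xargs, list it
# with --ignore-zeros so GNU tar reads past the end-of-archive
# marker of each concatenated stream.
gzip -dc all_archive.tgz | tar --ignore-zeros -tvf -
```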


Last edited by radoulov; 03-23-2009 at 05:56 PM.. Reason: corrected ...
# 10  
Old 03-23-2009
Thanks, Radoulov.

Will try this and let you know if I get stuck anywhere.
# 11  
Old 03-24-2009
Quote:
Originally Posted by radoulov
Code:
find . -type f -mtime +1176 | 
  xargs tar cf - | gzip > all_"$(date +%F)".tgz

On a GNU system:
1) Will this delete the existing files, or will I have to do that manually?
(It would be good if this command could delete as well - otherwise I will have to run the costly find operation once again to delete them. Not a big problem, though.)

2) I am not able to untar this directly. What is the way to check that this tar is sound (okay)?

The command below returns errors:
Code:
tar -tvf older_then_2008_2.tgz
tar: directory checksum error

I need to make sure the files are still good before I delete the uncompressed originals.

Thanks!
Kedar
# 12  
Old 03-24-2009
Quote:
Originally Posted by kedar.mehta
1) Will this delete the existing files, or will I have to do that manually?
(It would be good if this command could delete as well - otherwise I will have to run the costly find operation once again to delete them. Not a big problem, though.)
No, it will only archive the files. You should use another command to remove them:

Code:
find . -type f -mtime +1176 -exec rm {} +

If your find implementation does not support the + operator,
use xargs:

Code:
find . -type f -mtime +1176 | 
  xargs rm
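A slightly safer pattern (an editor's sketch, assuming GNU find; the archive name is a placeholder) is to test the archive first and then let find do the removal itself:

```shell
# Only delete if the compressed archive at least decompresses
# cleanly; GNU find's -delete removes the matches without
# spawning rm at all.
if gzip -t all_archive.tgz; then
  find . -type f -mtime +1176 -delete
fi
```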

Quote:
2) I am not able to untar this directly. What is the way to check that this tar is sound (okay)?

The command below returns errors:
Code:
tar -tvf older_then_2008_2.tgz
tar: directory checksum error

I need to make sure the files are still good before I delete the uncompressed originals.
You should use something like this:

Code:
gzip -dc older_then_2008_2.tgz | tar -tvf -
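Before removing the uncompressed files, it can also help to compare the number of entries in the archive with a fresh count from find - a sketch reusing the archive name and age test from the posts above:

```shell
# Count archive members and currently-matching files; if the two
# numbers differ, something was missed and nothing should be deleted.
in_tar=$(gzip -dc older_then_2008_2.tgz | tar -tf - | wc -l)
on_disk=$(find . -type f -mtime +1176 | wc -l)
echo "archive: $in_tar  on disk: $on_disk"
```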

# 13  
Old 03-24-2009
We really need to know what Operating System you are running and which is your preferred Shell.
Many unixes will not deal with single files above 2 GB - especially in tar. There may be other limitations in your OS which will require breaking the task down into manageable units.
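If a 2 GB file-size limit is a concern, one workaround (an editor's sketch, assuming GNU find, GNU tar, and split; the chunk size and names are placeholders) is to split the compressed stream into fixed-size pieces:

```shell
# No single output file exceeds 1 GB; reassemble later with
#   cat all_archive.tgz.part_* | gzip -dc | tar xf -
find . -type f -mtime +1176 -print0 |
  tar --null --files-from=- -cf - |
  gzip |
  split -b 1024m - all_archive.tgz.part_
```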
# 14  
Old 03-24-2009
Hi,

the error is related to large file sizes: your mv command fails when a file is larger than 2 GB.

Please make sure the mv you run is a largefile-aware build. Type which mv to get the current location of mv; if it is /bin, you can try /usr/local/bin/mv instead (give that explicit path inside the find command).

In other words, instead of plain mv, specify the explicit path.

I had this problem and corrected it this way.
You can check the Solaris man page for largefile.
Your file size is 5+ GB.
Hope it helps.
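To check whether a given filesystem can hold files over 2 GB at all, a quick probe (an editor's sketch; FILESIZEBITS is a standard getconf/pathconf variable) is:

```shell
# FILESIZEBITS is the number of bits used for file sizes on the
# filesystem holding the given path: 32 means a 2 GB ceiling,
# 64 means large files are supported.
getconf FILESIZEBITS .
# and confirm which mv binary is actually being run:
which mv
```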