I want to compress all files older than three years. I have thousands of files...
1) This doesn't work:
find ./ -type f -mtime +1176 -print | xargs -n1 -i tar -cvf {}
Error:
tar: Missing filenames
Probably because of this:
find ./ -type f -mtime -1 -print returns:
"
./temp.txt"
The first line is empty; maybe that's the problem.
To experiment, I just tried:
"find ./ | grep "txt" | xargs -n1 -i tar -cvf {}"
"find ./ | grep "txt"" returns no null characters, but it's still NOT working!
2)
tar -cvf find ./ -type f -mtime +1176 -print
tar -cvf `find ./ -type f -mtime -1`
Neither of these works. I get this output:
"a .// 0K
a . 0K
a .//temp.txt 0K
tar: .//find same as archive file
tar: -type: No such file or directory
tar: f: No such file or directory
tar: -mtime: No such file or directory
tar: -1: No such file or directory
tar: -print: No such file or directory"
3) The only thing that works is:
"find ./ -type f -mtime +1176 | gzip -c > temp.gz"
It creates temp.gz, but I am not sure whether this will work with thousands of files or not.
One more problem is that I can't unzip the files:
"gzip -dc temp.gz"
returns:
"./temp.gz
./find
./temp
./temp2.txt"
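For the record: in attempt 1, tar's "Missing filenames" error comes from `{}` being used as the *archive* name with no input files after it; and in attempt 3, gzip compressed the text of the find listing itself, not the files named in it, which is why gzip -dc just prints the filenames back. A minimal sketch that actually compresses each old file in place (an assumption on my part, not from the thread; filenames must not contain spaces):

```shell
# Gzip every regular file older than ~3 years; xargs splits the name
# list into batches, so "Too many arguments" cannot occur.
find . -type f -mtime +1176 -print | xargs gzip
```

Each file becomes its own .gz next to where the original was; xargs may invoke gzip several times, which is harmless here.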
The other point is that tar(1) on its own does not compress; it simply concatenates files together, which saves wasted space if the files are smaller than the blocksize of the filesystem and reduces the number of inodes used, but it will not compress the way gzip(1)ing them would, unless this is an extended version of tar that supports the -z option, of course!
The -type f mentioned by cbo0485 means that find does not list directories, only files.
Last edited by TonyFullerMalv; 03-23-2009 at 04:05 PM..
I didn't quite get you, actually. Is your code (quoted) going to process the whole list of files? Do you mean my code will get only one file compressed? I'm a bit confused.
Quote:
Originally Posted by cbo0485
That won't zip them at the exact same time, but it will filter through the results and zip them up one at a time.
The problem is with -exec, I believe. I have thousands of files (total size 50 GB). I think this code will still get me the same "Too many arguments" errors. I think xargs is required here.
The bottom-line question is: I have thousands of files, which I can obtain using some find command. Now I want to compress them all.
(Just some added info: all the files are under the same directory, on the same mount point. None of the filenames has funny characters like *, &, etc.)
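A single pipeline in the direction the thread is heading might look like this (a sketch, not from the thread, assuming GNU-ish tar and xargs and no spaces in filenames; `r` appends, so even if xargs splits the list and runs tar several times, every batch lands in the same archive):

```shell
# Append all matching files to one archive, batch by batch, then gzip it.
find . -type f -mtime +1176 -print | xargs tar rvf /tmp/old.tar
gzip /tmp/old.tar
```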
@Tony,
Thanks for elaborating on tar & gzip. It was definitely some value addition! I got the point about -type f (no directories).
The problem is the same old thing:
I want to compress a list of files at a time. I have thousands of files, so -exec runs short handling that large a number of arguments.
I could first copy the names of the files into a temp file and use it with gzip, but why do that if I can do it with a one-line command (if I can do so)? The volume is so large that it takes 1/2 hours just to list all the required files!
Please help me get through this problem. Please post the syntax/code as well.
If you have a "too many files" problem then break it down: put a certain number of files into a tar file, gzip the tar file, delete those files, then repeat. I would script it with something like this:
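The script itself did not survive in this copy of the thread; a hypothetical reconstruction along the lines described (16000 files per batch, tar, gzip, then rm) might look like:

```shell
#!/bin/sh
# Hypothetical reconstruction -- batch the find output 16000 names at a
# time, tar and gzip each batch, then delete the batched originals.
find . -type f -mtime +1176 -print > /tmp/listfile
split -l 16000 /tmp/listfile /tmp/batch.
n=0
for list in /tmp/batch.*
do
    [ -f "$list" ] || continue     # glob matched nothing: no old files
    n=`expr $n + 1`
    tar cvf "archive$n.tar" `cat "$list"`
    gzip "archive$n.tar"
    rm `cat "$list"`               # put "ls" here first to check
done
rm -f /tmp/listfile /tmp/batch.*
```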
I have not been able to test this (not having thousands of files to hand!), so I suggest putting an "ls" in place of the "rm" to start with, and you may have to reduce the 16000 to get the tar to work.
I hope you do not have filenames with spaces in!
If this were running on Solaris I could do tar cvf tarfile -I /tmp/listfile and filenames with spaces in would not be a problem...
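On Linux, GNU tar has an equivalent option, -T (--files-from), so the same list-file approach works there too and keeps the command line short no matter how many files match (a sketch; the list-file path is arbitrary):

```shell
# GNU tar reads the member names from a file instead of the command
# line, avoiding any "too many arguments" limit.
find . -type f -mtime +1176 -print > /tmp/listfile
tar -cvf /tmp/old.tar -T /tmp/listfile
gzip /tmp/old.tar
```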
Last edited by TonyFullerMalv; 03-23-2009 at 09:34 PM..
Thanks to both of you guys, frozentin & TonyFullerMalv. I am feeling a bit reluctant (actually lazy) to go for looping & scripting if we can do it with a single command line...