Compress and log files from October 6th in a loop
Good evening, I need your help please.
There are about 10,000 files from October 6th that I need to compress, but when I use this command it does not do anything; the prompt gives no response and I have to press CTRL+C to get back to the shell.
But if I list them, there are indeed files from October 6th.
There must be a syntax error or something?
The OS is SunOS.
I appreciate your help in advance.
------ Post updated at 09:44 PM ------
I found the problem, but now I face this while overwriting:
Looks like the files in your current working directory are not unique.
Is this true?
Do you have both (at the same moment in time, just as an example):
in the working directory when trying to compress?
The initial problem you found is at the end of the following line:
If you found the answer yourself, it is always nice to point it out and describe it, so others may benefit.
Yes, thank you, it was a silly mistake I made. The initial problem was:
Yes, they are not unique; there are some from September.
But I don't want to overwrite them, nor do I want to be asked whether to overwrite, so I implemented this to skip a file if the compressed copy already exists and to write the names of the gzipped files to a log file:
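(Sketch only, not the actual script: the log file name and the filename pattern below are placeholders.)

LOG=gzip_oct06.log
for f in *Oct06*                         # placeholder pattern for the October 6th files
do
    if [ -f "$f.gz" ]                    # a compressed copy already exists: skip it and note it
    then
        echo "skipped: $f" >> "$LOG"
    else
        gzip "$f" && echo "gzipped: $f" >> "$LOG"
    fi
done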
But it is kind of a hassle, because there are nearly 10,000 files to gzip and log.
So is there any better approach to enhance this script and make it run faster?
So that I do not rewrite it from scratch: it seems Scrutinizer has already written some good code to run parallel gzips, with examples and multiple scenarios.
Of course, it is not one-size-fits-all; you will have to adjust it a bit to meet your needs.
Post here if you get stuck with it or have any questions.
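(Scrutinizer's code is not reproduced here; the core idea, running the gzips in background batches rather than one at a time, looks roughly like this sketch, with the filename pattern and batch size as placeholders.)

N=8                                      # how many gzips to run at once (placeholder)
i=0
for f in *Oct06*                         # placeholder pattern for the October 6th files
do
    [ -f "$f.gz" ] && continue           # still skip files that already have a .gz copy
    gzip "$f" &
    i=`expr $i + 1`
    [ `expr $i % $N` -eq 0 ] && wait     # after every N background gzips, wait for the batch
done
wait                                     # wait for the final partial batch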
Personally, I would avoid reading the output of ls -l, because you could get false positives too easily. Perhaps an approach like this would be better:-
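(A sketch only, not the original post's code; the reference files under /tmp and their timestamps, midnight on the 6th and 7th of October of the current year, are assumptions.)

touch -t 10060000 /tmp/oct06_start       # reference file: 6 Oct 00:00
touch -t 10070000 /tmp/oct06_end         # reference file: 7 Oct 00:00

find . -type f ! -name '*.gz' \
     -newer /tmp/oct06_start ! -newer /tmp/oct06_end \
     -exec echo gzip {} \;               # echo makes this a dry run: it only prints the commands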
Remove the echo (it is only there as a dry run) if you are happy that it should work.
Good evening:
I need your help please.
Before deleting older logs from October 4th, I need to identify which files I have to remove; for instance, today is October 8th and I need to remove the files from October 4th.
Because there are so many files, I cannot list them without getting an "arg list too long" error, so
... (5 Replies)
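(For that kind of problem, the usual way around the "arg list too long" error is to let find build the list instead of the shell; a sketch with placeholder reference files for October 4th:)

touch -t 10040000 /tmp/oct04_start       # 4 Oct 00:00 - placeholder reference files
touch -t 10050000 /tmp/oct04_end         # 5 Oct 00:00

find . -type f -newer /tmp/oct04_start ! -newer /tmp/oct04_end -exec ls -l {} \;
# once the listing looks right, change "ls -l" to "rm -f" to actually delete the files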
Here's a sample of the data:
NAME BIRTHDAY SEX LOCATION AGE ID
Jim 05/11/1986 M Japan 27 86
Rei 08/25/1990 F Korea 24 33
Jane 02/24/1985 F India 29 78
I've been trying to sort files using the... (8 Replies)
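(The question is cut off above; as a generic sketch, sorting that sample numerically on the AGE column while keeping the header line could look like this, with the file name as a placeholder:)

head -1 people.txt                       # print the header line unchanged
sed '1d' people.txt | sort -k5,5n        # sort the remaining rows numerically on column 5 (AGE)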
Hi ALL,
I am working in a folder where there are a lot of files from the past year. I need to compress the files of one particular month only; suppose I need to compress just the February files, what script can we use?
Thanks in advance (2 Replies)
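(Same reference-file idea as in the thread above; a sketch for the February case, where the folder path and the timestamps, which assume the current year, are placeholders:)

touch -t 02010000 /tmp/feb_start         # 1 Feb 00:00 - placeholder reference files
touch -t 03010000 /tmp/feb_end           # 1 Mar 00:00

find /path/to/folder -type f ! -name '*.gz' \
     -newer /tmp/feb_start ! -newer /tmp/feb_end -exec gzip {} \;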
Hi All !
We have to compress a big data file on a UNIX server, transfer it to Windows, and uncompress it there using WinZip.
I have used the zip utility like below:
zip -e <newfilename> df2_test_extract.dat
But when I compress files greater than 4 GB using the zip utility, it... (4 Replies)
Hi folks,
I'm trying to compress a certain number of files from a CIFS mount to an XFS mount, but I cannot do it when the total size of the files is bigger than 2 GB.
Is there any limitation above 2 GB?
The OS is SLES, 64-bit.
The files are at most 1 MB each, so there are approx. 2,000 files to compress... (2 Replies)
Hello,
On a CentOS 5.0 server, Apache 2.2 serves static HTML pages. How could I compress those HTML pages to gain speed and save bandwidth? Is there a utility that would be effective and safe?
Thanks (2 Replies)
Hi All,
I would like to archive some of the files below (USFINUM042006_01.CSV,
USFINUM042006_02.CSV and USFINUM042006_03.CSV) and also use a wildcard, e.g. <command> USFINUM*.CSV. Also, there are a lot of similar files, but I want only the three latest files to be compressed. Which is the best... (3 Replies)
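(A sketch of one common way to pick the three newest matching files; it assumes the file names contain no spaces.)

# "ls -t" sorts by modification time, newest first; take the first three and gzip them
ls -t USFINUM*.CSV | head -3 | xargs gzip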
Hi Friends,
Can anyone help me out with compressing multiple files?
I have multiple files in a directory, and I have to compress them into a single file.
I tried using
gzip -r outfile.gz file1 file2 file3
but it is not working.
Thanks in advance for your help
S :) (5 Replies)
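(For reference, gzip on its own compresses each file separately and will not merge several files into one archive; the usual fix is to bundle them with tar first, roughly like this:)

tar cf outfile.tar file1 file2 file3     # bundle the files into a single archive
gzip outfile.tar                         # then compress it, producing outfile.tar.gz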