Deleting files based on a timestamp substring in the file name
Hello,
In my backup folder I have files with a naming convention like this:
randomFileNames_13-02-2014_23h13m09+1392333189
randomFileNames_14-02-2014_02h13m09+1392343989
randomFileNames_14-02-2014_04h13m09+1392351189
etc.
Based on the timestamp at the end of the filename, I would like to delete all files older than 10 days.
I prefer to rely on the timestamp in the file name and NOT on -mtime, because those backup files can be accessed at any time for reading.
Thanks for reading and for your help.
Thibault
Hmmm....
You mean something like:
This is based on the assumption that "randomFileNames" will not contain the character "+"; otherwise the above might become ambiguous. You might use the above in a manner like:
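A minimal sketch of the idea, assuming the epoch seconds follow the last "+" in the name (the file and variable names here are only illustrative):

```shell
# Hypothetical example file name, taken from the thread:
FILE="randomFileNames_13-02-2014_23h13m09+1392333189"

TS="${FILE##*+}"                     # everything after the last "+": the epoch seconds
NOW=$(date +%s)                      # current time in epoch seconds
LIMIT=$(( NOW - 10 * 24 * 3600 ))    # ten days ago

if [ "$TS" -lt "$LIMIT" ] ; then
    echo "older than 10 days: $FILE"
fi
```

In a real cleanup loop you would replace the echo with the actual rm (or mv) once you trust the selection.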
I hope this helps.
bakunin
Thanks so much for your prompt answer.
Yes the randomFileName does not contain the character "+".
May I ask you to explain " $TS -gt $some_int -a $TS -lt $other_int " to me?
Thanks so much.
Thibault
May I ask you to explain " $TS -gt $some_int -a $TS -lt $other_int " to me?
You might want to look at the man page of "test":
-gt: "greater than", true if value1 is greater than value2
-a: "logical AND", connects two conditions with an AND
-lt: "less than", true if value1 is less than value2
So in fact it reads: if $TS is greater than $some_int AND $TS is less than $other_int, "some_int < TS < other_int", for short.
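As a quick, self-contained illustration (the numbers are made up):

```shell
TS=1392333189
some_int=1392000000
other_int=1393000000

# true only if some_int < TS < other_int
if [ "$TS" -gt "$some_int" -a "$TS" -lt "$other_int" ] ; then
    echo "TS lies between some_int and other_int"
fi
```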
Hello,
Sorry for being late (a lot of work).
Your solution works great, but I have another question related to paths (is it something related to the absolute path of the script?). Here is my code:
Here is the output:
So $TRASHDIR is no good, but I don't know how to resolve this.
Any idea?
I have another question related to paths (is it something related to the absolute path of the script?). Here is my code:
OK, first objection: you should definitely not use backticks any more. They are a B-A-D habit. Use the modern command substitution instead: "$(....)".
Perfect would be to type your variables:
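For instance, in ksh (a sketch; the variable names are illustrative):

```shell
typeset -i TS=0      # TS may now only hold integer values
typeset -i NOW=0     # same for NOW

NOW=$(date +%s)      # current time in epoch seconds
TS=1392333189        # a timestamp taken from one of the example file names
```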
The same goes for the other variables. Personally (but that is only me) I use a sort of "Hungarian-style notation" to keep track of what is in my variables: prefix "i" is for integers, "f" for files/paths, "ch" for characters/strings, etc..
I would write, for instance:
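A sketch of that notation:

```shell
typeset -i iToday=0     # "i" prefix: an integer
typeset    chToday=""   # "ch" prefix: a character string

iToday=$(date +%s)      # today as Unix time (an integer)
chToday="$(date +%A)"   # today as day of the week (a string)
```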
Even if I have two different representations for "today" (once as Unix time, once as a day of the week) I can tell that "iToday" is an integer and "chToday" holds a string.
Quote:
Originally Posted by thuyetti
Don't do that. I don't mean the backticks this time (you shouldn't use them either, see above), but here you shouldn't forego the capability of any modern shell to deal with integers:
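For example (a sketch; the variable names are assumed from the script in the thread):

```shell
# instead of something like:  NOW=`date +%s` ; LIMIT=`expr $NOW - 864000`
NOW=$(date +%s)
LIMIT=$(( NOW - 10 * 24 * 3600 ))    # shell-builtin integer arithmetic
```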
will do the same and with a lot less effort. It is also easier to read, IMHO.
Quote:
Originally Posted by thuyetti
Here is the output:
So $TRASHDIR is no good, but I don't know how to resolve this.
Any idea?
Well, have a look at the output of ls -1 $BCK_DIR and tell me what you see. The list should look like this:
Do you see any path? I don't. So, inside your loop the variable "FILE" holds only a filename, not the complete path to it, yes? It will hold, for instance, filetest.txt_19-02-2014_17h58m33+1392829113, but not /Volumes/BACKUP/BCK_DATA/_ARCHIVES_/filetest.txt_19-02-2014_17h58m33+1392829113.
Now, you try to move this file to a certain other location. First problem: you can call this script from anywhere, but chances are the file you looked for in "$BCK_DIR" is not in the current directory. To make your script more robust use:
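Something like this (the trash location is hypothetical, and the echo is left in so you can inspect the command before letting it loose):

```shell
BCK_DIR="/Volumes/BACKUP/BCK_DATA/_ARCHIVES_"
TRASHDIR="/Volumes/BACKUP/TRASH"    # hypothetical location
FILE="filetest.txt_19-02-2014_17h58m33+1392829113"

# prefix the bare file name with its directory instead of relying on $PWD
echo mv "$BCK_DIR/$FILE" "$TRASHDIR/$FILE"
```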
Now it will work from anywhere.
Second: you do not check whether "$TRASHDIR" points to a valid directory and whether you (or your script) are allowed to write to it. You should check that before you even attempt to move files there:
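A sketch of such a check (using /tmp here only so the example runs anywhere; in the real script $TRASHDIR would be your trash location):

```shell
TRASHDIR="/tmp"    # demo value; use your real trash directory here

if [ ! -d "$TRASHDIR" ] ; then
    echo "ERROR: $TRASHDIR does not exist or is not a directory" >&2
    exit 1
fi
echo "$TRASHDIR exists"
```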
Alternatively you could try to create the directory, let the user specify another directory, do whatever - but you should definitely check every step your script undertakes. You do not have to correct every problem, but you should recognize and report it.
The same goes for accessibility: checking that the directory is there is not enough; you have to be able to access it:
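Again a sketch with /tmp standing in for the real location (for a directory, "w" lets you create or remove entries and "x" lets you enter it):

```shell
TRASHDIR="/tmp"    # demo value; use your real trash directory here

if [ ! -w "$TRASHDIR" ] || [ ! -x "$TRASHDIR" ] ; then
    echo "ERROR: no permission to write into $TRASHDIR" >&2
    exit 1
fi
echo "$TRASHDIR is writable"
```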
Some (like me) take such cautiousness to religious heights. You could also check: availability of i-nodes, enough free space in the filesystem to copy files there, and many more things. What I show you here is just the basics.
This looks tedious at first, but you will be rewarded with having absolutely unbreakable scripts which never produce an undefined state. They will always be clear about what went wrong and why they were unable to do their task.
bakunin,
Thanks again so much for taking your time to help me (and all of the users here...).
It works like a charm now. Here is my code:
I don't know why print does not work for outputting errors, so I use echo (does it matter?)
And
---------- Post updated at 04:20 PM ---------- Previous update was at 04:16 PM ----------
Bakunin,
I learned so much thanks to you; I'd like to learn more about the "modern shell".
Can you please give me some pointers on where to start?
Thanks again
Thibault