Bash script to check for files based on UST time.
I have 3 regions (AWS, EMEA, APJ) and receive files in a common shared path: files a, b, c at 3 AM UST for AWS; files x, y, z at 10 AM UST for EMEA; and files 1, 2, 3, 4, 5 at 11 PM UST for APJ. The file names do not change from day to day, but they differ from region to region. I need a single script that checks, region by region, whether all the files are present, and mails the names of any missing files. We use Tidal to schedule the script.
So you want to verify:
3 files (a, b, c) arrive at 03:00 UST for AWS
3 files (x, y, z) arrive at 10:00 UST for EMEA
5 files (1, 2, 3, 4, 5) arrive at 23:00 UST for APJ
Questions:
Do any of the files ever arrive late or early?
Do the files get deleted before the next time they arrive?
How big (roughly) are these files? Might it be that when you check for them half an hour later (as colleague jim mcnamara suggested) they are only partially transmitted?
Speaking of partial transmission: is there any indication that the transmission has finished AND is complete? Could it be that only half of a file gets transmitted and this part is then unusable? What should the script do in such a case?
A lot more information would be useful. I do not know Tidal syntax (you will have to come up with that part), and you have not told us your UNIX OS or any actual file names. The OS is REALLY important: a "works on all platforms everywhere" answer will not be optimal.
Now answer these questions so we can help you.
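Pending answers to those questions, the request can be sketched as a single script that Tidal invokes with the region code as its argument near each region's delivery time. The shared path, mail address, and mail command here are placeholders; only the file names come from the question.

```shell
#!/bin/sh
# Sketch of the requested checker. SHARED and the mail recipient are
# hypothetical; Tidal passes the region (AWS, EMEA, or APJ) as $1.
SHARED=${SHARED:-/common/shared/path}    # hypothetical shared path

check_region() {
    region=$1; shift
    missing=""
    for f in "$@"; do
        [ -f "$SHARED/$f" ] || missing="$missing $f"
    done
    if [ -n "$missing" ]; then
        echo "Missing for $region:$missing"
        # e.g.: echo "Missing:$missing" | mailx -s "$region files missing" ops@example.com
        return 1
    fi
}

case "${1-}" in
    AWS)  check_region AWS  a b c ;;
    EMEA) check_region EMEA x y z ;;
    APJ)  check_region APJ  1 2 3 4 5 ;;
    "")   : ;;                           # no region given; nothing to do
    *)    echo "usage: $0 AWS|EMEA|APJ" >&2 ;;
esac
```

One script covers all three regions because the region-to-file-list mapping lives in a single case statement; adding a region means adding one line.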
I created this script to check whether specific files exist in a given location, but when I run it, it always shows:
Failed - Flag_lms_device_info_20160628.txt do not exist
Failed - Flag_lms_weekly_usage_info_20160628.txt do not exist
but both files exist. Appreciate any help... (2 Replies)
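A common cause of this symptom (the test fails although the file clearly exists) is an invisible carriage return in the name being tested, for instance when the file list was edited on Windows. A minimal sketch that strips CRs before testing, with a message format mirroring the thread's output:

```shell
# Strip any stray carriage returns from the candidate name, then test.
# The failure message format follows the thread; file names are examples.
check_file() {
    f=$(printf '%s' "$1" | tr -d '\r')   # drop invisible CR characters
    if [ -f "$f" ]; then
        echo "OK - $f exists"
    else
        echo "Failed - $f does not exist"
    fi
}
```

Running the script through `cat -v` or `od -c` would confirm whether a `\r` is actually present; this sketch only shows the defensive fix.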
Hi All,
I have a scenario where I need to zip a huge number of DB audit log files newer than 90 days and delete anything older than that. If there are too many files, zipping takes a long time and causes CPU spikes. To avoid this I wanted to segregate files based on how old they are and... (2 Replies)
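The age split described in the thread maps directly onto `find -mtime`. A sketch, assuming GNU find and gzip; the directory is passed in because the real audit-log path is not given in the thread:

```shell
# Delete audit logs older than 90 days, then gzip the remainder one batch
# at a time (gentler on CPU than archiving everything in a single pass).
prune_audit_logs() {
    dir=$1
    find "$dir" -type f -name '*.log' -mtime +90 -delete
    find "$dir" -type f -name '*.log' ! -mtime +90 -exec gzip {} +
}
```

To further smooth CPU load, the `gzip` step could be run under `nice`/`ionice`, or `-exec gzip {} \;` used to compress one file at a time.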
Hey guys,
Sorry for the basic question, but I have a lot of files that I want to separate into groups based on filename, which I can then cat together. E.g. I have:
(a_b_c.txt)
WB34_2_SLA8.txt
WB34_1_SLA8.txt
WB34_1_DB10.txt
WB34_2_DB10.txt
WB34_1_SLA8.txt
WB34_2_SLA8.txt
77_1_SLA8.txt... (1 Reply)
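Judging from the examples, the grouping key appears to be the last underscore-separated field of the name (SLA8, DB10, ...); that choice is an assumption. A sketch that derives each key and cats every matching file into one output per key, assuming names contain no spaces:

```shell
# Group *_*.txt files by the last underscore-separated name field and
# concatenate each group into group_<key>.all (output name is an example).
group_by_suffix() {
    for key in $(ls *_*.txt 2>/dev/null | awk -F'[_.]' '{print $(NF-1)}' | sort -u); do
        cat *_"$key".txt > "group_$key.all"
    done
}
```

Within each group, `cat` receives the files in the shell's sorted glob order, so WB34_1_SLA8.txt comes before WB34_2_SLA8.txt.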
Hi friends,
I am trying to check if the two files I am expecting have been created in a specific location.
If both files do not exist, do an echo:
if -a ]; then
echo "files not found"
fi
It gives me the following message:
bash: Please help! :) (3 Replies)
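The quoted test lost its opening bracket and its file operands, which is why bash complains. Reading the requirement as "echo when both expected files are missing", a working form (the paths are hypothetical) is:

```shell
# Echo only when neither expected file exists; paths are placeholders.
both_missing() {
    if [ ! -f "$1" ] && [ ! -f "$2" ]; then
        echo "files not found"
    fi
}
# e.g.: both_missing /data/in/file1.csv /data/in/file2.csv
```

The `-a` seen in the fragment is `test`'s AND operator (`[ ! -f f1 -a ! -f f2 ]`), but chaining two `[ ]` tests with `&&` is the more portable idiom.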
Hi,
Please guide me toward a bash script that creates txt files named from the timestamp, so that each file name is different from the others, with a new file created every 10 minutes in a folder (/home/p2000/sxs137). Please guide me how would... (1 Reply)
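Naming a file from `date` makes every run's name unique down to the second; the every-10-minutes part belongs to cron rather than the script itself. A sketch (the script path in the crontab comment is an assumption; the folder comes from the thread):

```shell
# Create one empty .txt whose name is the current timestamp, in the given
# directory, and print the path created.
make_stamped_file() {
    dir=$1
    name="$(date +%Y%m%d_%H%M%S).txt"
    : > "$dir/$name"                 # create the empty file
    echo "$dir/$name"
}
# crontab entry (script path hypothetical):
#   */10 * * * * /path/to/script.sh /home/p2000/sxs137
```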
Dear All,
Please help me with this issue.
I want to write a script to check whether file updates are happening at the current time.
I have a set of files in a directory which are updated on a time basis.
Requirement: if the files are updating at system time, I just want to print "files are... (6 Replies)
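"Updating at the current time" can be approximated as "modified within the last N minutes". A sketch using find's `-mmin` (supported by GNU and BSD find); the threshold and the message wording are assumptions filling in the truncated requirement:

```shell
# Report whether any file in the directory has gone longer than N minutes
# without an update.
updated_recently() {
    dir=$1; mins=$2
    stale=$(find "$dir" -type f -mmin +"$mins")
    if [ -z "$stale" ]; then
        echo "files are updating"
    else
        echo "stale files:"
        echo "$stale"
    fi
}
```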
Hi,
I have a requirement: let us say 1000 files need to be transferred in an hour from one path to another, and if the files (all 1000) are transferred within the hour (say in 40 mins), then the process should remain idle for the remaining time (20 mins). (3 Replies)
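The fixed-window behavior is just "time the work, then sleep out the remainder". A sketch; the transfer command itself is whatever the poster already uses, so here it is passed in as arguments:

```shell
# Run a command, then sleep for whatever is left of the window so the job
# always occupies a fixed slot (e.g. one hour).
run_in_window() {
    window=$1; shift                     # window length in seconds
    start=$(date +%s)
    "$@"                                 # the actual transfer step
    elapsed=$(( $(date +%s) - start ))
    if [ "$elapsed" -lt "$window" ]; then
        sleep $(( window - elapsed ))    # stay idle for the remainder
    fi
}
# e.g. (paths hypothetical): run_in_window 3600 mv /src/path/. /dst/path/
```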
Hi All,
I know the timestamp of a file. Now I would like to list all the files in the directory with the same timestamp.
Any help would be appreciated.
Thanks.
sunny (1 Reply)
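One way to match a known timestamp exactly (to the second) is to bracket it with find's time tests. A sketch, assuming GNU find and GNU stat, and using a reference file that carries the known timestamp:

```shell
# List every file under dir whose mtime falls in the same second as ref's:
# not newer than ref, but newer than one second before ref.
same_mtime_as() {
    ref=$1; dir=$2
    t=$(stat -c %Y "$ref")                          # reference epoch seconds
    find "$dir" -type f ! -newer "$ref" -newermt "@$(( t - 1 ))"
}
```

If only the timestamp (not a file) is known, `touch -t` can first create a reference file with that timestamp.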
Hi,
I have some log files created in the following fashion:
Ex:
file name modified date
1) s.log1 01-jan-08
2) s.log2 02-jan-08
3) s.log3 03-jan-08
4) s.log4 04-jan-08
Now I want to have the latest 2 logs and delete the others.
Can you tell me the one liner /... (1 Reply)
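One-liner sketch for the truncated request: list the logs newest-first and delete everything after the first two. Parsing `ls` output assumes the names contain no spaces or newlines, which holds for the s.logN names shown; `xargs -r` (GNU) skips `rm` when nothing needs deleting.

```shell
# Keep the two most recently modified s.log* files, remove the rest.
keep_latest_two() {
    ls -t "$1"/s.log* | tail -n +3 | xargs -r rm --
}
```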
Hi,
What's the command for finding files older than 20 mins? It has to be part of the find command, as it will be part of a cleanup script.
Thanks,
Budrito (4 Replies)
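The usual answer is `-mmin`, the minutes-granularity counterpart of `-mtime` (available in GNU and BSD find, though not strictly POSIX):

```shell
# List regular files last modified more than 20 minutes ago.
older_than_20min() {
    find "$1" -type f -mmin +20
}
# cleanup variant: find "$dir" -type f -mmin +20 -exec rm -f {} +
```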